AI News, Difference between revisions of "Basic Computer Security/Introduction"
- On 4 October 2018
This book is written for a reader with little to no previous knowledge of security issues, but one who is familiar with the basic functionality of his or her computer's operating system.
Reading this book should give you a basic understanding of the processes needed to secure your home computer and home network, as well as protect your privacy and data on the web.
This feature (also visible at the top right of the page) will facilitate returning to the index to move on to the next section, returning to this introduction, or re-orienting yourself if you click on a link and find yourself lost.
A short word of warning before we begin: any book on the subject of security is likely to enlighten the reader on a variety of nasty things that could potentially happen to him or her.
It is our hope that by reading this book you will learn more about the world around you and gain valuable knowledge and understanding that will help you protect yourself, your privacy, and your information.
While these are probably not the best place to start learning about security, you may find them useful or interesting after you have become more familiar with some of the basics.
It is the scientific and practical approach to computation and its applications and the systematic study of the feasibility, structure, expression, and mechanization of the methodical procedures (or algorithms) that underlie the acquisition, representation, processing, storage, communication of, and access to, information.
Some fields, such as computational complexity theory (which explores the fundamental properties of computational and intractable problems), are highly abstract, while fields such as computer graphics emphasize real-world visual applications.
For example, programming language theory considers various approaches to the description of computation, while the study of computer programming itself investigates various aspects of the use of programming language and complex systems.
Although many initially believed it was impossible that computers themselves could actually be a scientific field of study, in the late fifties it gradually became accepted among the greater academic population.
Despite its short history as a formal academic discipline, computer science has made a number of fundamental contributions to science and society—in fact, along with electronics, it is a founding science of the current epoch of human history called the Information Age and a driver of the information revolution, seen as the third major leap in human technological progress after the Industrial Revolution (1750–1850 CE) and the Agricultural Revolution (8000–5000 BC).
For example, the study of computer hardware is usually considered part of computer engineering, while the study of commercial computer systems and their deployment is often called information technology or information systems.
Early computer science was strongly influenced by the work of mathematicians such as Kurt Gödel, Alan Turing, Rózsa Péter and Alonzo Church and there continues to be a useful interchange of ideas between the two fields in areas such as mathematical logic, category theory, domain theory, and algebra.
The relationship between computer science and software engineering is a contentious issue, which is further muddied by disputes over what the term 'software engineering' means, and how computer science is defined.
David Parnas, taking a cue from the relationship between other engineering and science disciplines, has claimed that the principal focus of computer science is studying the properties of computation in general, while the principal focus of software engineering is the design of specific computations to achieve practical goals, making the two separate but complementary disciplines.
Eden described them as the 'rationalist paradigm' (which treats computer science as a branch of mathematics, which is prevalent in theoretical computer science, and mainly employs deductive reasoning), the 'technocratic paradigm' (which might be found in engineering approaches, most prominently in software engineering), and the 'scientific paradigm' (which approaches computer-related artifacts from the empirical perspective of natural sciences, identifiable in some branches of artificial intelligence).
As a discipline, computer science spans a range of topics from theoretical studies of algorithms and the limits of computation to the practical issues of implementing computing systems in hardware and software. CSAB,
formerly called Computing Sciences Accreditation Board—which is made up of representatives of the Association for Computing Machinery (ACM), and the IEEE Computer Society (IEEE CS)—identifies four areas that it considers crucial to the discipline of computer science: theory of computation, algorithms and data structures, programming methodology and languages, and computer elements and architecture.
In addition to these four areas, CSAB also identifies fields such as software engineering, artificial intelligence, computer networking and communication, database systems, parallel computation, distributed computation, human–computer interaction, computer graphics, operating systems, and numerical and symbolic computation as being important areas of computer science.
All studies related to mathematical, logical, and formal concepts and methods could be considered theoretical computer science, provided that the motivation is clearly drawn from the field of computing.
The question of how efficiently a problem can be solved is addressed by computational complexity theory, which studies the time and space costs associated with different approaches to solving a multitude of computational problems.
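The time costs that complexity theory reasons about can be felt directly in practice. The following toy sketch (not from the original text; the data and sizes are invented for illustration) times the same problem, membership testing, against two data structures with different asymptotic costs:

```python
import timeit

# Toy illustration: the same problem (membership testing) solved with two
# data structures that have different time costs.
n = 100_000
as_list = list(range(n))
as_set = set(as_list)

target = n - 1  # worst case for the list: it must scan every element

# A linear scan of the list is O(n); the hash lookup in the set is O(1)
# on average, so the set lookups finish dramatically faster.
list_time = timeit.timeit(lambda: target in as_list, number=100)
set_time = timeit.timeit(lambda: target in as_set, number=100)

print(f"list lookup: {list_time:.4f}s, set lookup: {set_time:.6f}s")
```

The measured gap grows with `n`, which is exactly the kind of scaling behaviour complexity theory characterizes independently of any particular machine.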
The use of formal methods for software and hardware design is motivated by the expectation that, as in other engineering disciplines, performing appropriate mathematical analysis can contribute to the reliability and robustness of a design.
Formal methods are best described as the application of a fairly broad variety of theoretical computer science fundamentals, in particular logic calculi, formal languages, automata theory, and program semantics, but also type systems and algebraic data types to problems in software and hardware specification and verification.
The field often involves disciplines of computer engineering and electrical engineering, selecting and interconnecting hardware components to create computers that meet functional, performance, and cost goals.
Computer performance analysis is the study of work flowing through computers with the general goals of improving throughput, controlling response time, using resources efficiently, eliminating bottlenecks, and predicting performance under anticipated peak loads.
Computer security is a branch of computer technology with an objective of protecting information from unauthorized access, disruption, or modification while maintaining the accessibility and usability of the system for its intended users.
This branch of information science and computer science mainly focuses on the study of information processing, particularly with respect to system integration and human interactions with machine and data.
The study is connected to many other fields in computer science, including computer vision, image processing, and computational geometry, and is heavily applied in the fields of special effects and video games.
Scientific computing (or computational science) is the field of study concerned with constructing mathematical models and quantitative analysis techniques and using computers to analyze and solve scientific problems.
From its origins in cybernetics and in the Dartmouth Conference (1956), artificial intelligence research has been necessarily cross-disciplinary, drawing on areas of expertise such as applied mathematics, symbolic logic, semiotics, electrical engineering, philosophy of mind, neurophysiology, and social intelligence.
AI is associated in the popular mind with robotic development, but the main field of practical application has been as an embedded component in areas of software development, which require computational understanding.
The starting-point in the late 1940s was Alan Turing's question 'Can computers think?', and the question remains effectively unanswered, although the Turing test is still used to assess computer output on the scale of human intelligence.
But the automation of evaluative and predictive tasks has been increasingly successful as a substitute for human monitoring and intervention in domains of computer application involving complex real-world data.
One proposed explanation for this is that the rapid development of this relatively new field requires rapid review and distribution of results, a task better handled by conferences than by journals.
Fundamentals of Information Systems Security/Information Security and Risk Management
Information security means protecting information (data) and information systems from unauthorized access, use, disclosure, disruption, modification, or destruction.
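One building block behind the "modification" part of that definition is integrity checking with a cryptographic hash. A minimal sketch (the data values are invented for illustration):

```python
import hashlib

# Record a SHA-256 digest of the data we want to protect.
original = b"quarterly-report-v1"
digest = hashlib.sha256(original).hexdigest()

# Later, recompute the digest over whatever we received and compare.
received = b"quarterly-report-v1"   # unchanged copy
tampered = b"quarterly-report-v2"   # modified copy

unchanged_ok = hashlib.sha256(received).hexdigest() == digest
tamper_detected = hashlib.sha256(tampered).hexdigest() != digest

print("unchanged verified:", unchanged_ok, "| tampering detected:", tamper_detected)
```

A matching digest gives reasonable assurance the data was not modified; a mismatch flags unauthorized modification, though a hash alone does not tell you who changed the data or prevent the change.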
Since advancement is often tied to how well you can convince others, frequently people outside your own duties and department, of your value to the company through effective written communication, the result can be skilled resume writers and blame-deflecting email habits that gradually erode the company's standards and actual knowledge.
This erosion is often covered up by the relationships that form at the power levels within any group of people, and by so-called experts who have no real idea what is involved under the hood of the reports and applications they use, and who present no written proof when they claim expertise or assign blame to others.
Governance is the set of responsibilities and practices exercised by the board and executive management with the goal of providing strategic direction, ensuring that objectives are achieved, ascertaining that risks are managed appropriately and verifying that the enterprise's resources are used responsibly.
A security policy is an overall general statement produced by senior management (or a selected policy board or committee) that dictates what role security plays within the organization.
The COSO framework defines internal control as a process, effected by an entity's board of directors, management and other personnel, designed to provide reasonable assurance regarding the achievement of objectives in the following categories:
COBIT provides managers, auditors, and IT users with a set of generally accepted measures, indicators, processes and best practices to assist them in maximizing the benefits derived through the use of information technology and developing appropriate IT governance and control in a company.
The series provides best practice recommendations on information security management, risks and controls within the context of an overall Information Security Management System (ISMS), similar in design to management systems for quality assurance (the ISO 9000 series) and environmental protection (the ISO 14000 series).
All organizations are encouraged to assess their information security risks, then implement appropriate information security controls according to their needs, using the guidance and suggestions where relevant.
Given the dynamic nature of information security, the ISMS concept incorporates continuous feedback and improvement activities, summarized by Deming's 'plan-do-check-act' approach, that seek to address changes in the threats, vulnerabilities or impacts of information security incidents.
ISO/IEC 27002 and ISO/IEC 27001 remain the most widely used standards, because they provide the most basic guidance for an enterprise information security program's practices and processes, and because they are the current versions of their popular predecessors (BS 7799 and ISO 17799).
***WARNING*** A possible shortcoming of separation of duties (SOD): this approach can make it very difficult to determine the underlying cause of errors or failures in a large-scale entity's production automation, because no single person can view the information flow from the 'big picture' perspective — for example, an automated program that starts an application which produces incorrect output data without clearly failing to an error message, running on a virtual server client that transports the resulting data file to an outside client, and so on.
Each separated department's staff will typically glance at the application software used to manage their own section, see no obvious errors on their screen, assume the unknown error causing the system or process failure is not within their section, and return to writing up the accomplishments they delivered toward the entity's stated goals for their next review with management, because that is what HR told them to do.
Without the rare expert-level technicians who have (or can get) the administrative rights to view all aspects of a given production process, it can be nearly impossible to determine the underlying cause, which can lead to outrageous conclusions about what the problem must have been. (For example: deciding to abandon virtual servers and go back to multiple physical server machines, each connected to its own monitor, all because no error handling was coded into an in-house .NET program. Or nobody realizing that the automation host was running into RAM problems because every automated job was set to auto-start at exactly 6:00, or was hitting the built-in limit on simultaneous inbound network connections in client editions of MS Windows.) ***These SOD positions hold little interest for the high-level technical experts who seek to be constantly challenged.***
Introduction

The principle of least privilege, also known as the principle of minimal privilege or simply least privilege, requires that in a particular abstraction layer of a computing environment, every module (a process, a user, or a program, depending on the layer in question) must be able to access only the information and resources necessary to its legitimate purpose.
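The principle applies at every layer, down to a single file handle. A minimal sketch (not from the original text; the file name and contents are invented): request only the access mode the task actually needs, so a reader opened read-only simply cannot modify the file even if later code tries to.

```python
import io
import os
import tempfile

# Create a throwaway file to act as a protected resource.
path = os.path.join(tempfile.mkdtemp(), "config.txt")
with open(path, "w") as f:
    f.write("setting = 1\n")

# Least privilege: the reading task asks for a read-only handle.
with open(path, "r") as f:
    data = f.read()
    try:
        f.write("setting = 2\n")    # any write attempt fails immediately
        writable = True
    except (io.UnsupportedOperation, OSError):
        writable = False

print(data.strip(), "| write allowed:", writable)
```

Granting the minimal mode turns an accidental (or malicious) write into an immediate, contained error instead of silent data corruption.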
The higher the trust level, security clearance, or position within an organization's hierarchy, the greater the potential damage, in both cost and security terms, while the likelihood of getting caught in the criminal act decreases sharply.
Even more concerning, the chance of criminal charges actually being brought against the once 'most trusted' but now criminal insider falls to virtually zero at the very top levels: such offenders will settle before charges are filed, for a fraction of the amount stolen, with no damage to their reputation whatsoever, allowing them to maintain that 'most trusted' status.
This is clearly the most worrisome not just for those within an organization but for all people within a nation, or union of nations, bound by financial and economic trade agreements, since those agreements are based on trust, and its breakdown could lead to large-scale wars between those nations.
Others would counter (as stated in a separate security section above) that the data, system, and application owners should already be performing routine checks for damaged or lost data and compromised programs and applications.
These people ask: would you rather have a benign intruder who finds a way to penetrate your computer or network and then lets you know of the potential security flaw, or a criminal intruder who penetrates your system with intent to steal possibly millions of dollars from you or your customers before you had any clue of the risk you were taking, or the risks you were placing on your customers, clients, and confidential or proprietary data?
Anonymity

Anonymity on the internet is sometimes discussed in the same context as questions of privacy on the internet, because anonymity can provide many of the same benefits. For example, if someone is using the internet to obtain medical or psychological counseling, or to discuss sensitive topics (for example, AIDS), anonymity can afford protection similar to that of privacy.