
Portal:Artificial intelligence

We realize that the ultimate performance of a system will depend heavily on the task domain that it is situated in, and this motivates our preference for studying activity (behaviour) rather than thought (rationale).

Although the third approach (known as cognitive modelling) is of great importance to cognitive scientists, we concern ourselves with the fourth approach.

Of the four, this approach allows us to consider the performance of a theoretical system that yields the behaviour optimally suited to achieve its goals, given the information available to it.

We suggest the use of the learning project template (use 'subst:Learning project boilerplate' on the new page, inside double curly brackets {{}}).

Introduction to Psychology/Introduction

Psychology is an academic and applied discipline involving the scientific study of mental processes and behavior.

Psychology also refers to the application of such knowledge to various spheres of human activity, including problems relating to individuals' daily lives and the treatment of mental illness.

Psychology differs from the other social sciences — anthropology, economics, political science, and sociology — in that psychology seeks to explain the mental processes and behavior of individuals.

Whereas biology and neuroscience study the biological or neural processes and how they relate to the mental effects they subjectively produce, psychology is primarily concerned with the interaction of mental processes and behavior on a systemic level.

The year 1879 is commonly seen as the start of psychology as an independent field of study, because in that year German scientist Wilhelm Wundt founded the first laboratory dedicated exclusively to psychological research in Leipzig, Germany.

Evolutionary psychology is a theoretical approach to psychology that attempts to explain certain mental and psychological traits—such as memory, perception, or language—as evolved adaptations, i.e., as the functional products of natural or sexual selection.

Wundt argued that 'we learn little about our minds from casual, haphazard self-observation...It is essential that observations be made by trained observers under carefully specified conditions for the purpose of answering a well-defined question.'

At the turn of the 19th century, the founding father of experimental psychology, Wilhelm Wundt, tried to experimentally confirm his hypothesis that conscious mental life can be broken down into fundamental elements, which then form more complex mental structures.

Wundt's structuralism was quickly abandoned because it could not be tested in the way behavior can; only recently has brain-scanning technology been able to identify, for example, specialized brain cells that respond exclusively to basic lines and shapes, whose outputs are then combined in subsequent brain areas where more complex visual structures are formed.

This covers a broad range of research domains, examining questions about the workings of memory, attention, perception, knowledge representation, reasoning, creativity and problem solving.

The first use of the term 'psychology' is often attributed to the German scholastic philosopher Rudolf Goeckel (Latinized Rudolph Goclenius), published in 1590.[1] More than six decades earlier, however, the Croatian humanist Marko Marulić used the term in the title of a work which was subsequently lost.[2] This, of course, may not have been the very first usage, but it is the earliest documented use at present.

Partly in reaction to the subjective and introspective nature of Freudian psychology, and its focus on the recollection of childhood experiences, during the early decades of the 20th century behaviorism gained popularity as a guiding psychological theory.

In his paper 'Psychology as the Behaviorist Views It' (1913), Watson argued that psychology 'is a purely objective [emphasis added] experimental branch of natural science,' that 'introspection forms no essential part of its methods', and that 'the behaviorist recognizes no dividing line between man and brute.'

Behaviorism reigned as the dominant model in psychology through the first half of the 20th century, largely due to the creation of conditioning theories as scientific models of human behavior, and their successful application in the workplace and in fields such as advertising.

Chomsky demonstrated that language could not purely be learned from conditioning, as people could produce sentences unique in structure and meaning that couldn't possibly be generated solely through experience of natural language, implying that there must be internal states of mind that behaviorism rejected as illusory.

The humanistic approach has its roots in existentialist and phenomenological philosophy and many humanist psychologists completely reject a scientific approach, arguing that trying to turn human experience into measurements strips it of all meaning and relevance to lived existence.

Some of the founding theorists behind this school of thought were Abraham Maslow who formulated a hierarchy of human needs, Carl Rogers who created and developed client-centered therapy, and Fritz Perls who helped create and develop Gestalt therapy.

Links between brain and nervous system function were also becoming common, partly due to the experimental work of people such as Charles Sherrington and Donald Hebb, and partly due to studies of people with brain injury (see cognitive neuropsychology).

With the increasing involvement of other disciplines (such as philosophy, computer science and neuroscience) in the quest to understand the mind, the umbrella discipline of cognitive science has been created as a means of focusing such efforts in a constructive way.

Psycholinguistics/Theories and Models of Language Acquisition

Language acquisition is the process by which humans acquire the capacity to perceive, produce and use words to understand and communicate.

Language development is a complex and uniquely human capacity, yet children seem to acquire language at a very rapid rate, with most children's speech being relatively grammatical by age three (Crain &

Most children in a linguistic community seem to succeed in converging on a grammatical system equivalent to everyone else's in the community with few wrong turns, which is quite remarkable considering the pitfalls and complexity of the system.

Children acquire language in stages, and different children reach those stages at different times; what typically developing children learning the same language have in common, however, is that they follow an almost identical sequence of stages.

B. F. Skinner's Verbal Behaviour (1957) applied a functional analysis approach to analyze language behaviour in terms of its natural occurrence in response to environmental circumstances and the effects it has on human interactions.[3]

According to the behaviourist theory, language learning is a process of habit formation that involves a period of trial and error where the child tries and fails to use correct language until it succeeds.

Before long children will take on the imitation or modeling component of Skinner's theory of language acquisition in which children learn to speak by copying the utterances heard around them and by having their responses strengthened by the repetitions, corrections and other reactions that adults provide.

It seems that the human species has evolved a brain whose neural circuits contain linguistic information at birth and this natural predisposition to learn language is triggered by hearing speech.

Chomsky proposed that children are biologically prepared to acquire language regardless of setting thanks to the child's language acquisition device (LAD), a mechanism for working out the rules of language.

Chomsky believed that all human languages share common principles, such as all languages have verbs and nouns, and it was the child's task to establish how the specific language she or he hears expresses these underlying principles.

Children under the age of three usually don't speak in full sentences and instead say things like 'want cookie', yet you would still not hear them say things like 'want my' or 'I cookie', because statements like this would break the syntactic structure of the phrase, a component of universal grammar.

This was termed the critical period hypothesis, and since then there have been a few case examples of individuals subjected to such circumstances, such as the girl known as Genie, who was raised in an abusive environment that did not allow her to develop language skills.

Language is only one of the many human mental or cognitive activities and many cognitivists believe that language emerges within the context of other general cognitive abilities like memory, attention and problem solving because it is a part of their broader intellectual development.

Piaget's cognitive theory states that children's language reflects the development of their logical thinking and reasoning skills in stages, with each period having a specific name and age reference.[9]

It stresses the importance of the environment and culture in which the language is being learned during early childhood development because this social interaction is what first provides the child with the means of making sense of their own behaviour and how they think about the surrounding world.

Vygotsky also developed the concepts of private speech, in which children speak to themselves in a self-guiding and self-directing way (initially out loud and later internally), and the zone of proximal development, which refers to tasks a child is unable to complete alone but is able to complete with the assistance of an adult.

The utterances of the mother and father during the activities are ritualized and predictable so that the child is gradually moved to an active position where they take over the movements of the care-taker and eventually the ritualized language as well.

Another influential researcher of the interaction theory is Jerome Bruner, who elaborated and revised the details of the theory over a number of years and also introduced the term Language Acquisition Support System (LASS), which refers to the child's immediate adult entourage but in the fuller sense points to the child's culture as a whole, into which the child is born.

Adults adapt their behaviour towards children to construct a protected world in which the child is gradually inclined to take part in a growing number of scenarios and scripts, and in this way the child is led gradually further and further into language.

However, one must remember that although our social context provides support for language acquisition, it does not directly provide the knowledge that is necessary to acquire language, and this is perhaps where a child's innate abilities come into play.

The usage-based theory of language suggests that children initially build up their language through very concrete constructions based around individual words or frames on the basis of the speech they hear and use.

The usage-based theory takes constructions, which are direct form-meaning pairings, to be the basic units of grammar, and holds that children learn constructions by first mastering specific instances before going on to generalize and use the constructions productively with other lexical items.

Constructions gradually become more general and more abstract during the third and fourth years of life and grammar emerges as the speakers of a language create linguistic constructions out of recurring sequences of symbols (Tomasello, 2003).

According to Doughty and Long (2003), token frequency is how often in the input particular words or specific phrases appear and type frequency counts how many different lexical items a certain pattern or construction is applicable to.

Type frequency determines productivity because high type frequency ensures that a construction is used frequently, thus strengthening its representational schema and making it more accessible for further use with new items.
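
The token/type distinction can be made concrete with a small counting sketch. The Python snippet below uses an invented toy corpus (the utterances and counts are purely illustrative, not drawn from any real child-language data) to count how often one word form occurs (token frequency) and how many different verbs take the '-ed' pattern (type frequency).

```python
from collections import Counter

# Toy child-directed "corpus"; purely illustrative, not real data.
corpus = [
    "the dog walked", "daddy walked", "we played", "she jumped",
    "the dog walked", "we walked", "she played", "daddy jumped",
]

words = [w for utterance in corpus for w in utterance.split()]

# Token frequency: how often a particular word form appears in the input.
token_counts = Counter(words)
print("token frequency of 'walked':", token_counts["walked"])  # 4

# Type frequency: how many *different* lexical items the '-ed' pattern
# applies to, regardless of how often each one occurs.
ed_types = {w for w in words if w.endswith("ed")}
print("type frequency of '-ed':", len(ed_types))  # walked, played, jumped -> 3
```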

Another term coined in the usage-based theory is pre-emption, an anti-frequency mechanism which suggests that a child who experiences a verb in a rare construction will avoid using that verb in a more common structure.

In optimality theory, the essence of both language learning in general (learnability) and language acquisition (actual development children go through) entails the rankings of constraints from an initial state of the grammar to the language specific ranking of the target grammar (McCarthy, 2004)[17].

OT is a development of generative grammar, a theory sharing the quest for universal principles such as universal grammar, but it differs from the theory proposed by Chomsky because optimality theory holds that these universal constraints are violable (Kager, 1999)[18].

According to Langendoen (1997), these constraints include constraints governing aspects of phonology, such as syllabification constraints, constraints governing morphology, and constraints that determine the correct syntactic properties of a language.

Another term coined by the optimality theory is markedness, which refers to the continuum that language-universal and language-specific properties rest on, with completely unmarked properties being those found in virtually all languages and extremely marked properties being found quite rarely.

How infants accomplish this task has become the focus of debate especially for Patricia Kuhl who has developed the Native Language Magnet Model to help explain how infants at birth can hear all the phonetic distinctions used in the world's languages.

The idea that more than selection is involved in developmental phonetic perception has been clearly demonstrated by experimental findings showing that native language phonetic perception shows a significant improvement between 6 and 12 months of age.

Previous studies had shown native language improvement after 12 months of age and before adulthood, but newer studies, such as those by Kuhl and colleagues, have gone beyond selection in explaining developmental change in infants' perception of speech.

Better native phonetic perception at 7 months of age predicted accelerated language development at between 14 and 30 months whereas better non-native performance at 7 months predicted slower language development at 14 and 30 months.

Results supported the view that the ability to discriminate non-native phonetic contrasts reflects the degree to which the brain remains in the initial state, open and uncommitted to native language speech patterns.[20]

In my observation, an individual generally learns language through a combination of innate and cognitive abilities (innateness and cognitive theory) and external factors (optimality theory).

In my observation, a child will always have the innate capability to learn language; the amount of exposure the child gets from the environment will determine the extent of the usefulness of the language the child acquires.

The theories do have one thing in common, though: they all hold that language acquisition is the key aspect that distinguishes humans from other organisms, and that by understanding how different aspects of language are acquired we can better understand the main vehicle by which we communicate.

Introduction to Computer Information Systems/Information Systems

A system is a group of procedures and different elements that work together in order to complete a task.

Some information systems are meant to be used by all levels of employees while others are specifically designed to handle the needs of employees with certain responsibilities.

At the ground level, employees generally make job-related decisions that are based on 'on-the-job' input without having to consider how those decisions will affect other departments or employees in other positions.

These decisions are usually aimed at longer-term goals than those of operational managers and often need more intelligence pulled from data systems in order to reach these objectives.

Middle managers might be more concerned with how to improve yearly gains and may use systems that will deliver more detailed information about specific locations of factories or retailers in certain states.

Even though there are many systems, the four that will be elaborated are the following: transaction processing systems, customer relationship management systems, business intelligence systems, and knowledge management systems.

This system helps businesses keep records of customer activities, purchasing trends, product defects, and customer inquiries.

Although they may seem simple because we use them every day, Document Processing and Document/Content Management Systems can be very complicated when taken to a larger scale because they include not only the organization and creation of a database, but also ensuring the security of the documents within the system.[3]

Three types of widely used accounting systems are accounts payable systems, accounts receivable systems, and general ledger systems.

Accounts payable systems keep track of how much the business owes its suppliers, while accounts receivable systems keep track of how much customers owe the business.
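
To make the direction of these records concrete, here is a minimal sketch in Python; the supplier and customer names and invoice amounts are hypothetical, invented only for illustration. Accounts payable totals what the business still owes its suppliers, and accounts receivable totals what customers still owe the business.

```python
# Hypothetical invoice records for illustration only.
accounts_payable = [            # bills the business owes its suppliers
    {"supplier": "Acme Paper Co.", "amount": 1200.00, "paid": False},
    {"supplier": "Metro Utilities", "amount": 310.50, "paid": True},
]
accounts_receivable = [         # invoices customers owe the business
    {"customer": "J. Smith", "amount": 450.00, "paid": False},
    {"customer": "K. Lee",   "amount": 780.25, "paid": False},
]

owed_to_suppliers = sum(i["amount"] for i in accounts_payable if not i["paid"])
owed_by_customers = sum(i["amount"] for i in accounts_receivable if not i["paid"])

print(f"Accounts payable outstanding:    ${owed_to_suppliers:,.2f}")   # $1,200.00
print(f"Accounts receivable outstanding: ${owed_by_customers:,.2f}")   # $1,230.25
```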

An enterprise system is an integrated information system that is made to support business processes, information flows, reporting, and data analytics in complex organizations.

Some of these application processes may include sales and distribution, financial accounting, investment management, materials management, production planning, maintenance, and human resources.

An example would be a university or college that uses an enterprise system to manage all student records, enrollment applications and acceptance, finances, human resources, etc.[7]

Many companies are starting to implement enterprise systems because it is an easy way to combine the core functions of the company with technological advancements.

It is an easy way because the enterprise system is a single software architecture that fuses all the core processes of a business to function as one unit.

The synchronized functioning of the processes makes it easier and more efficient for multiple departments to work together, and it is also helpful for managers as they can better oversee multiple tasks and projects at one time.[8]

Data mining simply refers to any 'process of analyzing data from different perspectives and summarizing it into useful information'—in other words, taking a lot of data about anything, including public information, and analyzing it with software to a useful end that can't easily be reached by a human alone.

For example, supermarkets regularly have computers analyze massive amounts of data on which items are more or less frequently purchased in which locations so that they can stock stores with items that will be purchased by more individuals in that store's location.

They also might change the prices of items slightly on certain days when those items are more commonly purchased, and they stock items close to one another that are often purchased together.

There are many other uses of data mining besides just these (which are examples that have actually occurred, not just hypothetical ones), but in general, data mining is most frequently used by corporations to cut costs or increase revenues.[10]
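
As a rough illustration of the supermarket analysis described above, the short Python sketch below counts which pairs of items are most often purchased together; the transactions are invented for the example, not real retail data.

```python
from collections import Counter
from itertools import combinations

# Hypothetical transactions; in practice this would be millions of receipts.
baskets = [
    {"bread", "milk", "eggs"},
    {"bread", "butter"},
    {"milk", "eggs", "cereal"},
    {"bread", "milk", "butter"},
    {"bread", "milk"},
]

pair_counts = Counter()
for basket in baskets:
    # Count every unordered pair of items that appears in the same basket.
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The most frequently co-purchased pairs suggest items to stock near each other.
for pair, count in pair_counts.most_common(3):
    print(pair, count)
```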

Because prediction is the main goal, predictive data mining is the most common type of data mining, with popular and practical business application.

The first stage begins with data preparation, which may involve the cleaning and transformation of data, selecting subsets of records, or performing preliminary feature selection operations (to bring the number of variables or fields down to a manageable range).

It may also involve fitting simple, straightforward predictors for a regression model in order to identify the most relevant factors and determine the complexity and/or general nature of the models.

The second stage involves considering various models and choosing the best one based on their predictive performance (offering stable results across samples).

Many techniques developed to achieve this (Bagging, Boosting, Stacking, and Meta-Learning) are based on so-called 'competitive evaluation of models,' which applies different models to the same data, analyzes their performance, and chooses the best one.

The final stage consists of using the model selected as best in the previous stage and applying it to new data in order to generate predictions or estimates of the expected outcome.[11]
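
A compact sketch of these three stages is given below. It assumes the scikit-learn library is available and uses a synthetic dataset, so the particular models and numbers are illustrative only, not a prescribed workflow.

```python
# Sketch of the three stages of predictive data mining with scikit-learn
# (assumed to be installed); data and model choices are illustrative only.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier

# Stage 1: data preparation -- here, a synthetic dataset standing in for
# cleaned and transformed records, split into training data and "new" data.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_new, y_train, _ = train_test_split(X, y, test_size=0.2, random_state=0)

# Stage 2: competitive evaluation -- compare candidate models by
# cross-validated predictive performance and keep the best one.
candidates = {
    "logistic regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
}
scores = {name: cross_val_score(model, X_train, y_train, cv=5).mean()
          for name, model in candidates.items()}
best_name = max(scores, key=scores.get)
print("cross-validated accuracy:", scores, "-> best:", best_name)

# Stage 3: deployment -- refit the winning model on all training data
# and apply it to new records to generate predictions.
best_model = candidates[best_name].fit(X_train, y_train)
print(best_model.predict(X_new)[:10])
```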

Computer-aided manufacturing (CAM) is used to control machine tools and related machinery in the manufacturing of workpieces. CAM can also assist in all operations of a manufacturing plant, including planning, management, transportation and storage.

Compared to manual machines, there are several advantages to using CAM, such as speed (CAM is faster because machining speeds are higher), greater accuracy, greater consistency (every finished product is the same), efficiency (production can run 24 hours a day, 7 days a week) and sophistication (CAM is able to machine difficult shapes, e.g. tracks on a circuit board).[15]

You may not realize it, but whenever you fly on an airline, a massive amount of data has to go through a series of programs and locations and be approved before your flight can occur.

At a center, flight data information such as weather, weight, passenger information, and gate availability are all put together and interpreted to make a safe flight.

With constant advancement in technology pertaining to artificial intelligence, one would be well advised to consider the possible effects of these “life-like” computers.

Ray Kurzweil, author, inventor and futurist, proposes the idea that artificial intelligence, genetics, nanotechnology and robotics will soon result in a human-machine civilization.

He believes that in the not-so-distant future, due to advancements in genetics that will allow scientists to reprogram genes to eliminate disease and curb the aging process, man and machine will merge, “allowing one to transcend biological mortality.” Mr. Kurzweil may be a bit ahead of his time with his ideas and theories, but at the rate technology progresses in this age, it is hard to predict the heights it will reach[18].

Reinforcement learning is a process in which a computer works to answer a question or solve a problem and then associates the positive outcome of solving the problem with the actions it took to solve it.
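
As a minimal illustration of that idea, the sketch below implements tabular Q-learning on a made-up five-state corridor task (the environment, reward values, and parameter settings are invented for the example, not tied to any system mentioned here): reaching the goal yields a positive outcome, and the update rule strengthens the value of the actions that led there.

```python
import random

# Toy corridor task: states 0..4, reward 1.0 for reaching state 4, else 0.
N_STATES, GOAL = 5, 4
ACTIONS = (-1, +1)                       # step left or step right
alpha, gamma, epsilon = 0.5, 0.9, 0.1    # learning rate, discount, exploration rate

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def best_action(state):
    """Greedy action for a state, breaking ties at random."""
    top = max(Q[(state, a)] for a in ACTIONS)
    return random.choice([a for a in ACTIONS if Q[(state, a)] == top])

for episode in range(200):
    state = 0
    while state != GOAL:
        # Mostly exploit what has been learned so far, occasionally explore.
        action = random.choice(ACTIONS) if random.random() < epsilon else best_action(state)
        next_state = min(max(state + action, 0), N_STATES - 1)
        reward = 1.0 if next_state == GOAL else 0.0
        # The update ties the (discounted) positive outcome back to the action taken.
        target = reward + gamma * max(Q[(next_state, a)] for a in ACTIONS)
        Q[(state, action)] += alpha * (target - Q[(state, action)])
        state = next_state

# After training, the learned policy should step right (+1) in every non-goal state.
print({s: best_action(s) for s in range(GOAL)})
```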

Generative adversarial networks are systems consisting of one network that generates new data after learning from a training set and another network that tries to discriminate between real and fake data; the competition between the two networks produces increasingly realistic synthetic data.

Real-world uses for this could be to make video game scenery, to de-blur pixelated video footage, or to apply stylistic changes to computer-generated designs[19].
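
To make the generator/discriminator split concrete, here is a deliberately tiny sketch. It assumes the PyTorch library is installed, and the data (a one-dimensional Gaussian) and network sizes are invented purely for illustration; real GANs for imagery are far larger but follow the same adversarial loop.

```python
# Minimal GAN sketch (assumes PyTorch); it learns to imitate samples from a
# 1-D Gaussian rather than images, to keep the example small.
import torch
import torch.nn as nn

real_mean, real_std, noise_dim, batch = 4.0, 1.25, 8, 64

generator = nn.Sequential(nn.Linear(noise_dim, 16), nn.ReLU(), nn.Linear(16, 1))
discriminator = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()
ones, zeros = torch.ones(batch, 1), torch.zeros(batch, 1)

for step in range(2000):
    real = torch.randn(batch, 1) * real_std + real_mean
    fake = generator(torch.randn(batch, noise_dim))

    # Discriminator: label real samples 1 and generated samples 0.
    d_loss = loss_fn(discriminator(real), ones) + loss_fn(discriminator(fake.detach()), zeros)
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Generator: try to make the discriminator call its samples real.
    g_loss = loss_fn(discriminator(fake), ones)
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

with torch.no_grad():
    samples = generator(torch.randn(1000, noise_dim))
print("generated mean/std:", samples.mean().item(), samples.std().item())
```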

IBM Watson, developed by a research team led by David Ferrucci, is a question answering computer system capable of answering questions posed in natural language.

Cognitive computing, which still has no official definition, refers to hardware and/or software that mimics the functioning of the human brain and helps to improve human decision-making.

Business analysts make a strategic plan, carry out business model analysis, design the processes of the organization's work, and then interpret these for technical systems.

These steps are as follows: preliminary investigation, system analysis, system design, system acquisition, system implementation, and system maintenance.

The main point of doing a preliminary investigation is to determine what problems need to be fixed and what is the best way to go about solving those problems, if solutions do in fact exist.

This second step, system analysis, is used to investigate the problem on a larger scale and fine tune all the information a company has on the issue.

Use case diagrams are used to describe the behavior of the target system from an external point of view, while also illustrating the users who interact with the system.

As mentioned above, system analysis is the phase of system development where the problem area is fully studied in depth and the needs of system users are assessed.

The tools that will help accomplish this phase of collecting data and data analysis are entity-relationship diagrams (ERDs), data flow diagrams (DFDs), decision tables and decision trees, business process modeling notation (BPMN), and class diagrams and use case diagrams.

To describe the use of these tools in depth you will need to understand that any tools or processes used during this phase will aid in understanding the problems or issues of the current systems and how to improve them.

In addition to a data dictionary, the systems analyst also has to create input designs to help illustrate the input screens and other user interfaces that will be used to input data into the new system.

To ensure that the data is input accurately and secured against data loss, it is essential for the system design to contain some form of a security feature.

Also, an output design helps identify the specific outputs required to meet the information requirements, select methods required for presenting that information, and design reports, or other documents that carry the information.

Lastly, once the new system has finally been designed, a cost-benefit analysis is performed to determine whether the expected benefits (tangible or intangible benefits) of the new system are worth the expected cost.
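
As a simple illustration of the arithmetic involved, a basic cost-benefit check might compare expected yearly benefits against the system's costs and estimate a payback period. All figures in the sketch below are hypothetical, invented only for the example.

```python
# Hypothetical figures for a proposed system; for illustration only.
development_cost = 120_000          # one-time cost to build or acquire the system
annual_operating_cost = 15_000      # yearly support and maintenance
annual_benefit = 55_000             # expected yearly savings plus new revenue
years = 5                           # evaluation horizon

total_cost = development_cost + annual_operating_cost * years
total_benefit = annual_benefit * years
net_benefit = total_benefit - total_cost
payback_years = development_cost / (annual_benefit - annual_operating_cost)

print(f"Total cost over {years} years:    ${total_cost:,}")      # $195,000
print(f"Total benefit over {years} years: ${total_benefit:,}")   # $275,000
print(f"Net benefit:                      ${net_benefit:,}")     # $80,000
print(f"Payback period:                   {payback_years:.1f} years")  # 3.0 years
```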

The company, in order to make the most profit, should evaluate each bid and figure out which one charges the lowest price while also meeting the necessary criteria for the company’s system.

Some determinants used for the test include examining the amount of workload that a system is capable of processing, the capability of solving complex scientific problems using a range of computations, offering legitimate data for the system to process, and viewing the performance and scalability of the software, among many others.

Sometimes benchmarks are not capable of being performed due to a company’s location or accessibility, but for the most part they are a great way to assist in evaluating which bid is the best.[37]
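
A very rough sketch of a workload-style benchmark is shown below; the workload function and transaction counts are hypothetical placeholders, since a real benchmark test would exercise the vendor's actual system under realistic data.

```python
import time

def process_transaction(record):
    """Hypothetical stand-in for one unit of the system's real workload."""
    return sum(value * value for value in record)

def benchmark(workload_sizes):
    # Measure how many transactions per second are sustained at each workload size.
    for n in workload_sizes:
        records = [list(range(100)) for _ in range(n)]
        start = time.perf_counter()
        for record in records:
            process_transaction(record)
        elapsed = time.perf_counter() - start
        print(f"{n:>7} transactions: {elapsed:.3f}s ({n / elapsed:,.0f}/s)")

# Increasing workloads help reveal how performance scales.
benchmark([1_000, 10_000, 100_000])
```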

There are four ways of converting data to a new system: direct conversion, in which the old system is deactivated and the new one is implemented right away; parallel conversion, in which the old and new systems run side by side until the new one proves reliable; phased conversion, in which the new system is implemented one module or phase at a time; and pilot conversion, in which the new system is used at only one location before being rolled out to the rest of the organization.

System maintenance includes modifying existing software or adding completely new features to the existing software, as well as fixing any glitches or bugs and checking security.

Once a major change is determined to be the best option for a software, an organization must go through the system development life cycle again to replace the old system from scratch.

However, the exact sequence and tasks performed during each phase, and the names and number of the phases, may vary depending on the organization and the type of system being developed.

For example, smaller systems in smaller companies may skip or condense some activities, while other development projects may go back and repeat a previous step to refine the process before moving on.

The analyst focuses on three basic elements: the output that must be provided by the system; the source data, or input, that the user will provide to the system; and the processing needed to produce the output, given the input.

The fourth phase is system acquisition; financial institutions should ensure that systems are developed, acquired, and maintained with appropriate security controls.

This leads us to the last step, system implementation; in this phase, the production system is installed, initial user training is completed, user documentation is delivered, and the post-implementation review meeting is held.

With installation under the traditional approach, the biggest aspect is that the entire system is planned and built before anyone gets to use it or test it, so every aspect of every phase is essential to the traditional approach for system development.[42][43]

With that, the iterative approach acts as a response to the traditional development cycle, which is more likely to have “higher software costs and poor estimates of time and cost” due to the expense of changing a finished product.

As opposed to the iterative or traditional approaches, which both rely on professional developers, the end-user development approach is focused on the end users themselves configuring and developing the system, often using simple tools or programs.

Instead of having to be highly educated and a professional in the area of software or programming, someone trying to develop a simplistic and easy system can use these programming tools and develop something of their own.

This is usually used in small businesses, tasks, or daily projects, and is not something that an intricate business would ever use to run their day-to-day software programs.

decision support system: A type of information system typically used by upper management that provides people with the tools and capabilities to organize and analyze their decision-making information.

geographic information system (GIS): An information system that combines geographic information with other types of data (such as information about customers, sales, and so forth) in order to provide a better understanding of the relationships among the data.

intelligent agent: A program that performs specific tasks to help make a user’s work environment more efficient or entertaining and that typically modifies its behavior based on the user’s actions.

product lifecycle management (PLM) system: A system designed to manage a product as it moves through the various stages of its life cycle, from design to retirement.

Organizational behavior

Organizational Behavior is a field of study that investigates the impact that individuals, groups, and structure have on behavior within organizations, for the purpose of applying such knowledge toward improving an organization’s effectiveness.

An organization is a collection of people who work together to achieve a wide variety of goals, both goals of the various individuals in the organization and goals of the organization as a whole.

A second reason is to learn how to apply these concepts, theories, and techniques to improve behavior in organizations so that individuals, groups, and organizations can achieve their goals.

Macro-OB generally includes the study of organizations with a focus on structure, technology, organizational change, organizational learning, culture, decision-making, innovation and creativity, and so on.

Micro-OB on the other hand tends to focus more on individuals, groups/teams, and interpersonal issues such as motivation, personality, leadership, ethics, job design, power, politics, conflict, negotiations, etc.

If one wants to improve the behaviour of an individual or group, one has to look into the psychological needs of that individual or group. So the knowledge of psychology can really help in improving and modifying the behaviour of individuals and groups. If psychological needs are fulfilled, this gives people satisfaction and peace of mind, which can improve the ability of an organization.

Society provides the base for collective living and relationships, while social institutions provide the base for a better form and shape for society among its different organizations.

Social psychology deals with the fulfilment of social needs in a psychological sense of interpretation; it concerns the different socio-psychological conditions and affairs involved in the modification of OB, and it deals with the inner self of the individual.

Social psychology is the main and principal means of bringing different parts of an organization together in better harmony, and this improves OB and modifies it towards further improvement and achievement.

Anthropology is also defined as the 'science of human beings, especially of their environment and social relations, and their culture'. The environment plays a pivotal role in the improvement and modification of OB.
