AI News, Month: January 2015


Data storytelling has become a regular topic at data science conferences, and with good cause.

First, the story is what gives meaning to the data, leads people to understand our analysis, and supports the discussion of our findings. Second, our interpretation of the data is at least to some extent arbitrary and subjective, and there is no harm in admitting that.

Compared to stories without any data support, however, data-driven narratives have a far better chance of sustaining their claims.

A useful step-by-step way to get meaning into data by gradually abstracting it was proposed by Pei et al. in “Human Behavior Cognition Using Smartphone Sensors”, Sensors 2013, 13, 1402-1424.

If we have distance or speed measurements, the numbers alone won't tell us whether a metric or an imperial scale is applicable.
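To make that concrete, here is a minimal sketch, assuming the third-party pint library is available, of attaching explicit units to a raw measurement so that the chosen scale travels with the value instead of living in someone's head:

```python
# Minimal sketch (assumes the third-party "pint" library is installed).
# A bare number is ambiguous; a quantity with a unit is not.
import pint

ureg = pint.UnitRegistry()

raw = 8.5                             # ambiguous: km/h? mph? m/s?
speed = raw * ureg.mile / ureg.hour   # commit to one interpretation
print(speed.to(ureg.kilometer / ureg.hour))  # ~13.68 kilometer / hour
```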

A profile of the route, for example, shows grade and change in altitude. Most fitness trackers do this in their dashboards by showing our training efforts in the context of situational data they can easily match with them.

We won't tell a plausible story if we don't embed it in the panorama our audience would expect to experience had they lived through the story in person.

This is where data science finally gets married to classic social research: the questionnaire-based interview.

Using the framework method for the analysis of qualitative data in multi-disciplinary health research

The Framework Method sits within a broad family of analysis methods often termed thematic analysis or qualitative content analysis.

These approaches identify commonalities and differences in qualitative data, before focusing on relationships between different parts of the data, thereby seeking to draw descriptive and/or explanatory conclusions clustered around themes.

While in-depth analyses of key themes can take place across the whole data set, the views of each research participant remain connected to other aspects of their account within the matrix so that the context of the individual’s views is not lost.

It is therefore useful where multiple researchers are working on a project, particularly in multi-disciplinary research teams where not all members have experience of qualitative data analysis, and for managing large data sets where obtaining a holistic, descriptive overview of the entire data set is desirable.

The Framework Method is most commonly used for the thematic analysis of semi-structured interview transcripts, which is what we focus on in this article, although it could, in principle, be adapted for other types of textual data [13], including documents, such as meeting minutes or diaries [12], or field notes from observations [10].
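To illustrate the matrix structure mentioned above, here is a toy sketch, assuming pandas is available; the participants, themes, and cell summaries are all invented. Each row keeps one participant's account together, while each column supports cross-case comparison of a theme:

```python
# Toy illustration of a Framework Method case-by-theme matrix.
# All names, themes, and summaries below are invented for demonstration.
import pandas as pd

framework_matrix = pd.DataFrame(
    {
        "Diet and food preparation": ["avoids reheated food", "no change reported"],
        "Trust in medicine": ["sceptical of tablets", "follows GP advice"],
    },
    index=["Participant 01", "Participant 02"],
)

print(framework_matrix)                        # compare a theme across cases
print(framework_matrix.loc["Participant 01"])  # or read one case in context
```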

Although the Framework Method is a highly systematic method of categorizing and organizing what may seem like unwieldy qualitative data, it is not a panacea for problematic issues commonly associated with qualitative data analysis such as how to make analytic choices and make interpretive strategies visible and auditable.

It is therefore essential that studies using the Framework Method for analysis are overseen by an experienced qualitative researcher, though this does not preclude those new to qualitative research from contributing to the analysis as part of a wider research team.

Some research questions, however, would require a more inductive approach that allows for the unexpected and permits more socially located responses [25] from interviewees, which may include matters of cultural beliefs, habits of food preparation, concepts of ‘fate’, or links to other important events in their lives, such as grief, that cannot be predicted by the researcher in advance.

It is not within the scope of this paper to consider study design or data collection in any depth, but before moving on to describe the Framework Method analysis process, it is worth taking a step back to consider briefly what needs to happen before analysis begins.

As any form of qualitative or quantitative analysis is not a purely technical process, but is influenced by the characteristics of the researchers and their disciplinary paradigms, critical reflection throughout the research process is paramount, including in the design of the study, the construction or collection of data, and the analysis.

Researchers cannot be too attached to certainty; they must remain flexible and adaptive throughout the research in order to generate rich and nuanced findings that embrace and explain the complexity of real social life and can be applied to complex social issues.

It is important to remember when using the Framework Method that, unlike quantitative research where data collection and data analysis are strictly sequential and mutually exclusive stages of the research process, in qualitative analysis there is, to a greater or lesser extent depending on the project, ongoing interplay between data collection, analysis, and theory development.

Qualitative research

In the conventional view of statisticians, qualitative methods produce explanations only of the particular cases studied (e.g., as part of an ethnography of a newly implemented government program); any more general conclusions are considered tentative propositions (informed assertions).

In contrast, a qualitative researcher might argue that understanding of a phenomenon, situation, or event comes from exploring the totality of the situation (e.g., phenomenology, symbolic interactionism), often with access to large amounts of 'hard data' of a non-numerical form.

A popular method of qualitative research is the case study (Stake 1995[9], Yin 1989[10]), which examines in depth 'purposive samples' to better understand a phenomenon (e.g., support to families).

The case study method exemplifies the qualitative researchers' preference for depth, detail, and context, often working with smaller and more focused samples, compared with the large samples of primary interest to statistical researchers seeking general laws[3].

These methods may be used alongside quantitative methods, scholarly or lay reviews of the literature, interviews with experts, and computer simulation, as part of a multimethod approach to data collection and analysis (called triangulation)[3].

In particular, one could argue that qualitative researchers often reject natural-science models of truth, prefer inductive, hypothesis-generating research processes and procedures (over hypothesis-testing models), are oriented towards investigations of meaning(s) rather than behaviour, and prefer data in the form of words and images that are ideally naturally derived.

Babbie, a sociologist, documents that qualitative methods have been used for several centuries, but anthropologists brought qualitative field research methods to the forefront through their 19th century observations of preliterate societies.

As Robert Bogdan and Sari Biklen describe in their education text, 'historians of qualitative research have never, for instance, included Freud or Piaget as developers of the qualitative approach, yet both relied on case studies, observations and in-depth interviewing'.[16]

However, this history is not apolitical, as it has ushered in a politics of 'evidence' (e.g., evidence-based practices in health and human services) and of what can count as 'scientific' research in scholarship, a current, ongoing debate in the academy.

Qualitative methods include development and practice, narratology, storytelling, transcript poetry, classical ethnography, state or governmental studies, research and service demonstrations, focus groups, case studies, participant observation, qualitative review of statistics in order to predict future happenings, and shadowing, among many others.

Other sources include focus groups, observation (without a predefined theory, such as statistical theory, in mind), reflective field notes, texts, pictures, photographs and other images, interactions and practice captured on audio or video recordings, and public documents.

A focus group involves a moderator facilitating a small group discussion between selected individuals on a particular topic, with video and hand-scribed data recorded. It is useful in a coordinated research approach studying phenomena in diverse ways, in different environments, and with distinct stakeholders who are often excluded from traditional processes.

The research then must be 'written up' into a report, book chapter, journal paper, thesis or dissertation, using descriptions, quotes from participants, charts and tables to demonstrate the trustworthiness of the study findings.

From the experimental perspective, the major stages of research (data collection, data analysis, discussion of the data in the context of the literature, and drawing conclusions) should each be undertaken once (or at most a small number of times) in a research study.

In qualitative research, however, all four stages above may be undertaken repeatedly until one or more specific stopping conditions are met, reflecting a non-static attitude to the planning and design of research activities.

An example of this dynamism might be when the qualitative researcher unexpectedly changes their research focus or design midway through a study, based on their first interim data analysis, and then makes further unplanned changes based on a second interim data analysis.

Qualitative researchers would argue that their recursivity in developing the relevant evidence and reasoning enables the researcher to be more open to unexpected results, more open to the potential of building new constructs, and open to the possibility of integrating them with the explanations developed continuously throughout a study.[3]

For example, an interpretivist researcher might believe in the existence of an objective reality 'out there', but argue that the social and educational reality we act on never allows a single human subject direct access to that reality (a view shared by constructivist philosophies).

When coding is complete, the analyst may prepare reports via a mix of: summarizing the prevalence of codes, discussing similarities and differences in related codes across distinct original sources/contexts, or comparing the relationship between one or more codes.
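As a hypothetical sketch of that reporting step, the snippet below summarizes code prevalence and code co-occurrence across sources; the coded fragments are invented:

```python
# Hypothetical coded data: (source, codes applied to one fragment).
from collections import Counter
from itertools import combinations

coded_fragments = [
    ("interview_01", {"diet", "fate"}),
    ("interview_01", {"diet", "grief"}),
    ("interview_02", {"fate"}),
]

# Prevalence: how often each code was applied overall.
prevalence = Counter(code for _, codes in coded_fragments for code in codes)

# Co-occurrence: which codes were applied to the same fragment.
pairs = Counter(frozenset(pair)
                for _, codes in coded_fragments
                for pair in combinations(sorted(codes), 2))

print(prevalence.most_common())
print(pairs.most_common())
```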

Because qualitative analyses are often more inductive than the hypothesis testing nature of most quantitative research, the existing 'theoretical sensitivity' (i.e., familiarity with established theories in the field) of the analyst becomes a more pressing concern in producing an acceptable analysis.

The variety, richness, and individual characteristics of qualitative data are argued to be largely omitted from such data-coding processes, rendering the original collection of qualitative data somewhat pointless.

To defend against the criticism of too much subjective variability in the categories and relationships identified from data, qualitative analysts respond by thoroughly articulating their definitions of codes and linking those codes soundly to the underlying data, thereby preserving some of the richness that might be absent from a mere list of codes, whilst satisfying the need for repeatable procedure held by experimentally oriented researchers.

Some data analysis techniques rely on using computers to scan and reduce large sets of qualitative data, automating what is often referred to as the tedious, hard work of research studies, such as processing field notes.

Another scenario is when the chief value of a dataset is the extent to which it contains 'red flags' (e.g., searching for reports of certain adverse events within a lengthy journal dataset from patients in a clinical trial) or 'green flags' (e.g., searching for mentions of your brand in positive reviews of marketplace products).
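A minimal sketch of such a 'flag' scan, with invented flag terms and sample data, might reduce a corpus to just the passages that need human attention:

```python
# Reduce a text corpus to sentences containing 'red flag' terms.
# The term list and sample report are invented for illustration.
import re

RED_FLAGS = re.compile(r"\b(nausea|dizziness|rash)\b", re.IGNORECASE)

def flagged_passages(documents):
    """Yield (doc_id, sentence) for every sentence containing a flag term."""
    for doc_id, text in documents.items():
        for sentence in re.split(r"(?<=[.!?])\s+", text):
            if RED_FLAGS.search(sentence):
                yield doc_id, sentence

reports = {"patient_07": "Day 3: mild rash on left arm. Otherwise fine."}
print(list(flagged_passages(reports)))  # [('patient_07', 'Day 3: mild rash on left arm.')]
```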

Analysts respond by proving the value of their methods relative to either a) hiring and training a human team to analyze the data or b) letting the data go untouched, leaving any actionable nuggets undiscovered.

Each of the paradigms listed by Guba and Lincoln are characterized by axiomatic differences in axiology, intended action/impact of research, control of research process/outcomes, relationship to foundations of truth and knowledge, validity and trust (see below), textual representation and voice of the researcher and research participants, and commensurability with other paradigms.

This 'narrative turn' is producing an enormous literature as researchers present sensitizing concepts and perspectives that bear especially on narrative practice, which centers on the circumstances and communicative actions of storytelling.

There are many different ways of establishing trustworthiness, including: member check, interviewer corroboration, peer debriefing, prolonged engagement, negative case analysis, auditability, confirmability, bracketing, and balance.

In the 1980s and 1990s, the new qualitative research journals became more multidisciplinary in focus, moving beyond qualitative research's traditional disciplinary roots of anthropology, sociology, and philosophy.[49]

In the late 1980s to 1990s, early academic articles emerged beginning the transformation from institutional studies (e.g., Taylor's 'Let them eat programs') to studies of community, community services and community life reviewed and cited in professional journals.[50][51]

First steps in qualitative data analysis: transcribing

Qualitative research can explore the complexity and meaning of social phenomena,1,2 for example patients' experiences of illness3 and the meanings of apparently irrational behaviour such as unsafe sex.4 Data for qualitative study may comprise written texts, among other sources.

In contrast, formulating a patient's medical history with statements such as ‘she reports a pain in the left leg’ or ‘she denies alcohol use’ frames the patient's account as less trustworthy than the doctor's observations.10 The aims of a project and methodological assumptions have implications for the form and content of transcripts since different features of data will be of analytic interest.7 Making recordings involves reducing the original data, for example, selecting particular periods of time and/or particular camera angles.

Verbal and non-verbal interaction together shape communicative meaning.11 The aims of the project should dictate whether visual information is necessary for data interpretation, for example, room layout, body orientation, facial expression, gesture and the use of equipment in consultation.12 However, visual data are more difficult to process, since they take a huge length of time to transcribe and there are fewer conventions for how to represent visual elements on a transcript.5

The meanings of utterances are profoundly shaped by the way in which something is said in addition to what is said.13,14 Transcriptions need to be very detailed to capture features of talk such as emphasis, speed, tone of voice, timing and pauses, and these elements can be crucial for interpreting data.7 The following example shows how the addition of pauses, laughter and body conduct to a transcript invites a different interpretation of an exchange between doctor and patient.

Representing (some) non-verbal features of the interaction on the transcript changes the interpretation of this two-line interaction (see Appendix, transcription conventions):

Dr 9: (..) I would suggest (..) yes paracetamol or ibuprofen is a good (..) symptomatic treatment (..) um (.) (slapping hands on thighs) and you'll be fine
Pt K: fine (..) okay (.) well (..) (shrugging shoulders and laughing) thank you very much

In the second representation of this interaction, both speakers pause frequently.

The doctor addresses this accountability directly in her next turn:

Dr 5: I must ask you (.) why have you come in today because it is a Saturday morning (1.0) it's for urgent cases only that really have just started
Pt F: Yes because it has been troubling me since last last night (left hand still on neck)

This more detailed level of transcribing facilitates analysis of the social relationship between doctor and patient.

Interaction is also understood in a wider context, such as understanding questions and responses to be part of an ‘interview’ or ‘consultation’ genre with particular expectations for speaker roles and the form and content of talk.19 For example, the question ‘how are you?’ from a patient in consultation would be interpreted as a social greeting, while the same question from a doctor would be taken as an invitation to recount medical problems.14 Contextual information about the research helps the transcriber to interpret recordings (if they are not the person who collected the data), for example, details about the project aims, the setting and participants, and interview topic guides if relevant.

For example, ‘hwaryuhh’ is much more easily read and understood if represented as separate words, with punctuation and capital letters, as ‘How are you?’.20 Choosing to use the grammar and spelling conventions of standard UK written English aids readability, but at the same time irons out the linguistic variety which is an important feature of cultural and subcultural identity.20 For example, the following extract represents a patient speaking a Cockney English dialect (typically spoken by working-class Londoners), in consultation with a doctor speaking English with Received Pronunciation (typically spoken by educated, middle-class English people):

Dr 1: so what are your symptoms since yesterday (..) the aches
Pt B: aches ere (..) in me arm (..) sneezing (..) edache
Dr 1: ummm (..) okay (..) and have you tried anything for this (.) at all?
Pt B: no (..) I ain't a believer of me- (.) medicine to tell you the truth
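Transcripts that follow such conventions can also be processed programmatically; here is a hypothetical sketch that counts pauses and non-verbal annotations per speaker, reusing the notation from the extracts above:

```python
# Count pause markers "(.)"/"(..)" and other parenthesised annotations
# per speaker in a conventions-style transcript (sample lines from above).
import re
from collections import defaultdict

transcript = """Dr 1: so what are your symptoms since yesterday (..) the aches
Pt B: aches ere (..) in me arm (..) sneezing (..) edache"""

stats = defaultdict(lambda: {"pauses": 0, "nonverbal": 0})
for line in transcript.splitlines():
    speaker, _, talk = line.partition(": ")
    stats[speaker]["pauses"] += len(re.findall(r"\(\.+\)", talk))
    stats[speaker]["nonverbal"] += len(re.findall(r"\((?!\.+\))[^)]+\)", talk))

print(dict(stats))  # e.g. {'Dr 1': {'pauses': 1, ...}, 'Pt B': {'pauses': 3, ...}}
```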

There is debate about what counts as relevant context in qualitative research.21,22 For example, studies usually describe the setting in which data were collected and demographic features of respondents such as their age and gender, but relevant contextual information could also include historical, political and policy context, participants’ physical appearance, recent news events, details of previous meetings and so on.23 Authors’ decisions on which data and what contextual information to present will lead to different framing of data.

Data visualization

Data visualization or data visualisation is viewed by many disciplines as a modern equivalent of visual communication.

Users may have particular analytical tasks, such as making comparisons or understanding causality, and the design principle of the graphic (i.e., showing comparisons or showing causality) follows the task.

Tables are generally used where users will look up a specific measurement, while charts of various types are used to show patterns or relationships in the data for one or more variables.

Data visualization refers to the techniques used to communicate data or information by encoding it as visual objects (e.g., points, lines or bars) contained in graphics.

To convey ideas effectively, both aesthetic form and functionality need to go hand in hand, providing insights into a rather sparse and complex data set by communicating its key aspects in a more intuitive way.

Yet designers often fail to achieve a balance between form and function, creating gorgeous data visualizations which fail to serve their main purpose: to communicate information.[6]

In Charles Minard's map of Napoleon's Russian campaign, the line width illustrates a comparison (the size of the army at points in time), while the temperature axis suggests a cause of the change in army size.

Author Stephen Few described eight types of quantitative messages that users may attempt to understand or communicate from a set of data, along with the associated graphs used to help communicate each message.

For example, it may require significant time and effort ('attentive processing') to identify the number of times the digit '5' appears in a series of numbers; if that digit differs in size, orientation, or color, however, instances of it can be noted quickly through pre-attentive processing.

For example, since humans can more easily process differences in line length than surface area, it may be more effective to use a bar chart (which takes advantage of line length to show comparison) rather than pie charts (which use surface area to show comparison).[14]
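The point is easy to see side by side; here is a small sketch, assuming matplotlib is available, plotting the same values as a bar chart and as a pie chart:

```python
# Same data twice: bars encode values as lengths, pies as areas/angles.
# With close values, the ranking is legible in the bars but not the pie.
import matplotlib.pyplot as plt

labels = ["A", "B", "C", "D"]
values = [23, 20, 19, 38]

fig, (ax_bar, ax_pie) = plt.subplots(1, 2, figsize=(8, 3))
ax_bar.bar(labels, values)
ax_bar.set_title("Bar chart: compare lengths")
ax_pie.pie(values, labels=labels)
ax_pie.set_title("Pie chart: compare areas")
plt.tight_layout()
plt.show()
```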

Such maps can be categorized as Thematic Cartography, which is a type of data visualization that presents and communicates specific data and information through a geographical illustration designed to show a particular theme connected with a specific geographic area.

The earliest documented forms of data visualization were various thematic maps from different cultures, as well as ideograms and hieroglyphs that conveyed information and allowed its interpretation.

The idea of coordinates was used by ancient Egyptian surveyors in laying out towns; earthly and heavenly positions were located by something akin to latitude and longitude at least by 200 BC; and the map projection of a spherical earth into latitude and longitude was developed by Claudius Ptolemy [c.85–c.165].

One surviving graph, from the 10th or possibly 11th century, is intended as an illustration of planetary movement and was used in an appendix of a textbook in monastery schools.[23]

By the 16th century, techniques and instruments for precise observation and measurement of physical quantities, and geographic and celestial position were well-developed (for example, a “wall quadrant” constructed by Tycho Brahe [1546–1601], covering an entire wall in his observatory).

The French philosopher and mathematician René Descartes, together with Pierre de Fermat, developed analytic geometry and the two-dimensional coordinate system, which heavily influenced practical methods of displaying and calculating values.

Private schools have also developed programs to meet the demand for learning data visualization and associated programming libraries, including free programs like The Data Incubator or paid programs like General Assembly.[25]

Data presentation architecture (DPA) is a skill-set that seeks to identify, locate, manipulate, format and present data in such a way as to optimally communicate meaning and proper knowledge.

Data presentation architecture weds the science of numbers, data, and statistics (discovering valuable information from data and making it usable, relevant, and actionable) with the arts of data visualization, communications, organizational psychology, and change management. The goal is to provide business intelligence solutions with the data scope, delivery timing, format, and visualizations that will most effectively support and drive operational, tactical, and strategic behaviour toward understood business (or organizational) goals.

Often confused with data visualization, data presentation architecture is a much broader skill set that includes determining what data on what schedule and in what exact format is to be presented, not just the best way to present data that has already been chosen.
