AI News, Surviving Data Science at the Speed of Hype
There is this idea endemic to the marketing of data science that big data analysis can happen quickly, supporting an innovative and rapidly changing company.
Once upon a time, I built a model for Dell that optimized the distribution of their chassis and monitors from China to their fulfillment centers in the U.S. Over and over again, my team worked on customizing our model to Dell's supply chain.
A good predictive model requires a stable set of inputs with a predictable range of values that won't drift away from the training set.
Maybe customer support starts working different shifts, maybe a new product is released or prices change and demand shifts away from historical levels, or maybe your customer base skews toward a younger demographic than your ML models were trained to target.
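Drift like this is often visible before the model's accuracy collapses, simply by comparing live inputs against the training distribution. Below is a minimal sketch of that idea; the feature, values, and alert threshold are all invented for illustration.

```python
# Minimal input-drift check: how far has the live mean of a feature
# moved from the training mean, measured in training standard deviations?
from statistics import mean, stdev

def drift_score(train_values, live_values):
    """Return the shift of the live mean in training-set sigmas."""
    mu, sigma = mean(train_values), stdev(train_values)
    if sigma == 0:
        return 0.0
    return abs(mean(live_values) - mu) / sigma

train_ages = [25, 30, 35, 40, 45, 50, 55]   # what the model was trained on
live_ages = [18, 19, 21, 22, 20, 23, 19]    # customer base skews younger

score = drift_score(train_ages, live_ages)
if score > 1.5:   # arbitrary alert threshold
    print(f"drift alert: live mean is {score:.1f} sigma from training")
```

Real monitoring would track many features and use proper distribution tests, but even a crude check like this turns "the model quietly got worse" into an explicit signal.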
The pitch is that if only you had the right tools, your analytics could stay ahead of the changing business, so that your data informs the change rather than lagging behind it.
Data modeling is a process used to define and analyze data requirements needed to support the business processes within the scope of corresponding information systems in organizations.
Therefore, the process of data modeling involves professional data modelers working closely with business stakeholders, as well as potential users of the information system.
The data requirements are initially recorded as a conceptual data model which is essentially a set of technology independent specifications about the data and is used to discuss initial requirements with the business stakeholders.
The last step in data modeling is transforming the logical data model to a physical data model that organizes the data into tables, and accounts for access, performance and storage details.
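The progression from a technology-independent conceptual model to a physical model can be made concrete with a small sketch. Here the same "Customer" entity appears first as a plain record (the conceptual/logical view) and then as a SQLite table with keys and an index (the physical view); all names are hypothetical.

```python
# The same entity at two levels of data modeling.
import sqlite3
from dataclasses import dataclass

@dataclass
class Customer:          # conceptual/logical view: what the business tracks
    customer_id: int
    name: str
    segment: str

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customer (              -- physical view: tables, keys,
        customer_id INTEGER PRIMARY KEY, -- and storage/access details
        name        TEXT NOT NULL,
        segment     TEXT
    )
""")
# Access concerns (e.g. frequent lookups by segment) only appear here.
conn.execute("CREATE INDEX idx_customer_segment ON customer (segment)")
conn.execute("INSERT INTO customer VALUES (?, ?, ?)", (1, "Acme", "enterprise"))
row = conn.execute("SELECT name FROM customer WHERE customer_id = 1").fetchone()
```

The dataclass says nothing about storage; the DDL commits to one implementation. Keeping the two separate is exactly what lets stakeholders agree on the former before engineers argue about the latter.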
The use of data modeling standards is strongly recommended for all projects requiring a standard means of defining and analyzing data within an organization.
In the context of business process integration (see figure), data modeling complements business process modeling, and ultimately results in database generation.
However, the term 'database design' could also be used to apply to the overall process of designing, not just the base data structures, but also the forms and queries used as part of the overall database application within the Database Management System or DBMS.
Entity-relationship modeling is a relational schema database modeling method, used in software engineering to produce a type of conceptual data model (or semantic data model) of a system, often a relational database, and its requirements in a top-down fashion.
For example, a generic data model may define relation types such as a 'classification relation', a binary relation between an individual thing and a kind of thing (a class), and a 'part-whole relation', a binary relation between two things, one with the role of part and the other with the role of whole, regardless of the kinds of things that are related.
By standardization of an extensible list of relation types, a generic data model enables the expression of an unlimited number of kinds of facts and will approach the capabilities of natural languages.
Conventional data models, on the other hand, have a fixed and limited domain scope, because the instantiation (usage) of such a model only allows expressions of kinds of facts that are predefined in the model.
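The contrast can be sketched in a few lines: a generic model stores every fact as a (subject, relation, object) triple, so a new kind of fact needs only a new relation type, not a schema change. The relation names below follow the examples in the text; the entities are invented.

```python
# Generic data model sketch: facts as (subject, relation, object) triples.
facts = [
    ("monitor_123", "classification", "Monitor"),      # individual -> class
    ("panel_7",     "part_whole",     "monitor_123"),  # part -> whole
]

def add_fact(subject, relation, obj):
    facts.append((subject, relation, obj))

# A conventional model would need a new table or column for a new kind
# of fact; here we simply introduce a new relation type.
add_fact("monitor_123", "located_in", "warehouse_TX")

parts_of = [s for s, r, o in facts if r == "part_whole" and o == "monitor_123"]
```

The flexibility has a cost: the schema no longer enforces which facts make sense, which is why conventional models with fixed domain scope remain the norm for transactional systems.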
The logical data structure of a DBMS, whether hierarchical, network, or relational, cannot totally satisfy the requirements for a conceptual definition of data because it is limited in scope and biased toward the implementation strategy employed by the DBMS.
The overall goal of semantic data models is to capture more meaning of data by integrating relational concepts with more powerful abstraction concepts known from the Artificial Intelligence field.
The idea is to provide high level modeling primitives as integral part of a data model in order to facilitate the representation of real world situations.
In our extensive work with human services agencies, we’ve found it’s very common for clients to need to step back and revisit the organization’s program model.
The following is a list of the business process analysis best practices we follow in preparing our clients for case management and outcomes tracking systems.
We define a program model in these terms: A strategically defined set of services delivered to a specific target population by a distinct set of staff in order to achieve one or more intended outcomes or results.
A working program model is not an ideal that an existing agency makes up out of thin air: it comprises all of your business processes and the logic behind them (you may need to go further back to your logic model if it’s not clear why you are operating in certain ways).
You may wish to separately record minor tweaks to your process and slowly roll them in, but if you’re making larger changes, we recommend you not simply put them in the model and expect them to happen.
You may wish to review some tips from our technology culture post: steps 1, 2, and 5 in particular will be valuable for thinking about this change (substitute “technology” for any other process change), in terms of communicating with staff and considering all relevant perspectives.
This final stage of outcomes management connotes ongoing examination of whether your programs and services are achieving their desired results for your clients and helping you accomplish your mission.
Creating an outcomes-focused culture is a multi-faceted approach which starts with defining your program model and theory of change and the related data to measure results.
Defining your program model or models may take some time and energy investment up front, but when done well, it can speed your agency along a more strategic path that will pay dividends in terms of both understanding and improving your impact and client outcomes.
Business process modeling
Business process modeling (BPM) in business process management and systems engineering is the activity of representing processes of an enterprise, so that the current process may be analysed, improved, and automated.
With advances in software design, the vision of BPM models becoming fully executable (and capable of simulations and round-trip engineering) is coming closer to reality.
Techniques to model business process such as the flow chart, functional flow block diagram, control flow diagram, Gantt chart, PERT diagram, and IDEF have emerged since the beginning of the 20th century.
New methodologies include business process redesign, business process innovation, business process management, integrated business planning, among others, all 'aiming at improving processes across the traditional functions that comprise a company'.
In the field of software engineering, the term 'business process modelling' was set in contrast to the more common software process modelling, with the aim of focusing more on the state of practice during software development.
At that time (the early 1990s), all existing and new modelling techniques for illustrating business processes were consolidated under the heading 'business process modelling languages'.
Business process modelling became the base of new methodologies, for instance those that supported data collection, data flow analysis, process flow diagrams and reporting facilities.
The term 'business model' is thus used for a broad range of informal and formal descriptions to represent core aspects of a business, including purpose, offerings, strategies, infrastructure, organizational structures, trading practices, and operational processes and policies.
A business process is a collection of related, structured activities or tasks that produce a specific service or product (serve a particular goal) for a particular customer or customers.
The artifact-centric business process model has emerged as a holistic approach for modelling business processes, as it provides a highly flexible solution to capture operational specifications of business processes.
It particularly focuses on describing the data of business processes, known as 'artifacts', by characterizing business-relevant data objects, their life-cycles, and related services.
Business process modelling tools provide business users with the ability to model their business processes, implement and execute those models, and refine the models based on as-executed data.
As a result, business process modelling tools can provide transparency into business processes, as well as the centralization of corporate business process models and execution metrics.
Business process modelling tools should not be confused with business process automation systems. Both practices begin with modelling the process, but automation produces an 'executable diagram', which is drastically different from the output of traditional graphical business process modelling tools.
BPM suite software provides programming interfaces (web services, application program interfaces (APIs)) which allow enterprise applications to be built to leverage the BPM engine.
Other types of business reference model can also depict the relationship between the business processes, business functions, and the business area's business reference model.
Business models are developed as defining either the current state of the process, in which case the final product is called the 'as is' snapshot model, or a concept of what the process should become, resulting in a 'to be' model.
By comparing and contrasting 'as is' and 'to be' models the business analysts can determine if the existing business processes and information systems are sound and only need minor modifications, or if reengineering is required to correct problems or improve efficiency.
As organizations strive to attain their objectives, business process management attempts to continuously improve processes: defining, measuring, and improving them in an ongoing 'process optimization' cycle.
Why the Lean Start-Up Changes Everything
Launching a new enterprise—whether it’s a tech start-up, a small business, or an initiative within a large corporation—has always been a hit-or-miss proposition.
According to the decades-old formula, you write a business plan, pitch it to investors, assemble a team, introduce a product, and start selling as hard as you can.
It’s a methodology called the “lean start-up,” and it favors experimentation over elaborate planning, customer feedback over intuition, and iterative design over traditional “big design up front” development.
Although the methodology is just a few years old, its concepts—such as “minimum viable product” and “pivoting”—have quickly taken root in the start-up world, and business schools have already begun adapting their curricula to teach them.
In many ways it is roughly where the big data movement was five years ago—consisting mainly of a buzzword that’s not yet widely understood, whose implications companies are just beginning to grasp.
According to conventional wisdom, the first thing every founder must do is create a business plan—a static document that describes the size of an opportunity, the problem to be solved, and the solution that the new venture will provide.
The lean method has three key principles: First, rather than engaging in months of planning and research, entrepreneurs accept that all they have on day one is a series of untested hypotheses—basically, good guesses.
They go out and ask potential users, purchasers, and partners for feedback on all elements of the business model, including product features, pricing, distribution channels, and affordable customer acquisition strategies.
Then, using customers’ input to revise their assumptions, they start the cycle over again, testing redesigned offerings and making further small adjustments (iterations) or more substantive ones (pivots) to ideas that aren’t working.
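The decision rule in that cycle (keep a validated hypothesis, iterate on a near miss, pivot on a clear failure) can be written down as a tiny function. The scoring scale and thresholds below are invented purely to illustrate the loop.

```python
# Toy build-measure-learn decision rule.
def next_action(feedback_score):
    """feedback_score in [0, 1]: how strongly customers validated the guess."""
    if feedback_score >= 0.7:
        return "keep"      # hypothesis validated
    if feedback_score >= 0.4:
        return "iterate"   # small adjustment, test again
    return "pivot"         # substantive change of direction

hypotheses = {"freemium pricing": 0.8, "door-to-door sales": 0.2}
plan = {h: next_action(score) for h, score in hypotheses.items()}
print(plan)
```

The point is not the numbers but the shape of the loop: every hypothesis gets an explicit disposition after each round of customer feedback, rather than surviving by default inside a static plan.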
Unlike typical yearlong product development cycles that presuppose knowledge of customers’ problems and product needs, agile development eliminates wasted time and resources by developing the product iteratively and incrementally.
During the dot-com boom, start-ups often operated in “stealth mode” (to avoid alerting potential competitors to a market opportunity), exposing prototypes to customers only during highly orchestrated “beta” tests.
The lean start-up methodology makes those concepts obsolete because it holds that in most industries customer feedback matters more than secrecy and that constant feedback yields better results than cadenced unveilings.
(I’ve been involved with eight high-tech start-ups, as either a founder or an early employee.) When I shifted into teaching, a decade ago, I came up with the formula for customer development described earlier.
Eric quickly recognized that waterfall development, the tech industry’s traditional, linear product development approach, should be replaced by iterative agile techniques.
He also saw similarities between this emerging set of start-up disciplines and the Toyota Production System, which had become known as “lean manufacturing.” Eric dubbed the combination of customer development and agile practices the “lean start-up.” The tools were popularized by a series of successful books.
In 2003, I wrote The Four Steps to the Epiphany, articulating for the first time that start-ups were not smaller versions of large companies and laying out the customer development process in detail.
But on the basis of what I’ve seen at hundreds of start-ups, at programs that teach lean principles, and at established companies that practice them, I can make a more important claim: Using lean methods across a portfolio of start-ups will result in fewer failures than using traditional methods.
Employment growth in the 21st century will have to come from new ventures, so we all have a vested interest in fostering an environment that helps them succeed, grow, and hire more workers.
Another constraint was the structure of the venture capital industry, in which a small number of firms each needed to invest big sums in a handful of start-ups to have a chance at significant returns.
(This is less an issue in Europe and other parts of the world, but even overseas there are geographic entrepreneurial hot spots.) The lean approach reduces the first two constraints by helping new ventures launch products that customers actually want, far more quickly and cheaply than traditional methods, and the third by making start-ups less risky.
Indeed, it’s become quite common to see young tech companies that practice the lean start-up methodology offer software products that are simply “bits” delivered over the web or hardware that’s built in China within weeks of being formed.
If the entire universe of small business embraced them, I strongly suspect it would increase growth and efficiency, and have a direct and immediate impact on GDP and employment.
For years they taught students to apply large-company approaches—such as accounting methods for tracking revenue and cash flow, and organizational theories about managing—to start-ups.
As business schools embrace the distinction between management execution and searching for a business model, they’re abandoning the business plan as the template for entrepreneurial education.
Instead of preparing to build a factory, scale up production, and launch the new offering (ultimately named Durathon) as a traditional product extension, Logan applied lean techniques.
These weren’t sales calls: The team members left their PowerPoint slides behind and listened to customers’ issues and frustrations with the battery status quo.
According to press reports, demand for the new batteries is so high that GE is already running a backlog of orders.
The first hundred years of management education focused on building strategies and tools that formalized execution and efficiency for existing businesses.
- On Monday, August 19, 2019