
Five Steps to Insights March 8, 2017

Posted by stewsutton in Big Data, business analytics, business intelligence, Collaboration, Computational Knowledge, Information Technology, Knowledge Management.

There are some very simple steps (five in this example) that a curious person with the right tools can take to understand what the data has to tell them.  Generally it starts with a simple question. You want to know something. This sends you on a quest to gather some data. Often this gathering process is quite time consuming. In some scientific endeavors, the gathering is the tedious process of “recording” the data that you observe in your experiment.  This may take many days, weeks, or sometimes years.

Then comes the next step: the preparation of that data for exploration. The way data is recorded and gathered is seldom the structure needed for reporting.  The data must be transformed and reshaped.  This does not change the values in the data; rather, it molds the way the data is organized so that it can be explored with data visualization tools.  At this point, I believe, the fun part begins. This is the moment where you may begin to explore the data.  Exploration is a very good label for what happens at this point.  You are navigating and observing what is there, and you are seeing things for the first time.  This process is exciting, and it often brings you insights and understanding that you can share with others.
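As an illustration of the prepare-and-reshape step, here is a minimal sketch using Python and pandas; the dataset, column names, and values are all invented for this example:

```python
import pandas as pd

# Hypothetical experiment log: observations recorded one row at a time
# (the "gathering" step), in long format.
records = pd.DataFrame({
    "day":     [1, 1, 2, 2, 3, 3],
    "sensor":  ["A", "B", "A", "B", "A", "B"],
    "reading": [0.42, 0.51, 0.44, 0.49, 0.47, 0.50],
})

# "Preparation" step: reshape (not change) the values so a visualization
# tool can plot one line per sensor over time.
wide = records.pivot(index="day", columns="sensor", values="reading")
print(wide)
```

The values never change; `pivot` only reorganizes them so each sensor becomes a column that any charting tool can plot over time (for example, `wide.plot()`), which is where the exploration begins.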

Computational Knowledge February 4, 2014

Posted by stewsutton in Architecture, Big Data, business intelligence, Collaboration, Computational Knowledge, Economics, Education, Knowledge Management.

Right now we have a serious need for more students to fall in love with all of the STEM subjects, which fall into the categories of science, technology, engineering, and mathematics. We know these fields fuel economic growth, so training a STEM workforce has been recognized as a key goal in education policy. And yet, there is an enthusiasm gap in these subject areas, and nowhere is that more evident than math. In the United States, students don’t think they’re good at math, so they become quite adept at hating it. Many students, it seems, would rather eat broccoli than do math homework (and that is within a fast-food culture where broccoli is viewed as utterly disgusting). Not surprisingly, these students are significantly underperforming. So how do we change this?

The way we teach math needs to be reinvented!

In a nutshell, “students need visual and interactive curriculum that ties into real life.” Nowhere is the power of good mathematical instruction better demonstrated than within the environment of Wolfram Mathematica.

Properly teaching math breaks math down into four components:

1. Posing the right questions
2. Turning a real world problem into a math formulation
3. Computation
4. Turning a math formulation back into the real world and verifying it.

We spend perhaps 80 percent of the time in math education teaching people to do #3 (computation by hand). This is the one step that computers already do better than any human, no matter how many years of practice. Why are we doing this?

Instead, let us use computers to calculate. After all, that’s the math chore we hate the most. It may have been necessary to teach this skill 50 years ago, though there are certainly a few practical cases where hand calculation remains useful today.
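The post’s environment of choice is Mathematica; as a sketch of the same division of labor in Python with SymPy (the projectile problem and its numbers are invented for this example), the person does steps 2 and 4 while the machine does step 3:

```python
import sympy as sp

# Step 2: turn a real-world question into a math formulation.
# Invented example: a ball is thrown upward at 20 m/s; when does it land?
# Height as a function of time: h(t) = 20*t - 4.9*t**2
t = sp.symbols("t", positive=True)
h = 20 * t - sp.Rational(49, 10) * t**2

# Step 3: hand the computation to the machine.
landing = sp.solve(sp.Eq(h, 0), t)

# Step 4: carry the answer back to the real world and sanity-check it:
# a little over four seconds, which is physically plausible.
print(landing)
```

The `positive=True` assumption is what lets the machine discard the trivial solution at t = 0, a small example of posing the question well (step 1) before computing.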

The goal of the Wolfram technology is to collect and curate all objective data; implement every known model, method, and algorithm; and make it possible to compute whatever can be computed about anything. We see this technology achieving some pretty spectacular levels of performance in Wolfram|Alpha and within Mathematica as well. Integrating this form of computational knowledge within classrooms is going to have a powerful multiplying effect on student performance and understanding as they orient themselves to solving real-life problems with the power of computational knowledge.

Big Data July 13, 2013

Posted by stewsutton in Architecture, Big Data, business analytics, Cloud, Cloud Computing, Information Technology, Knowledge Management.

Perhaps you have heard of the term “big data.” It certainly seems to be riding atop the curve of inflated expectations. It is probably healthy to be just a bit suspicious of “big data” solutions coming to the rescue where all others have been unsuccessful.

There are certainly examples where scientists compare approaches to problem solving, and this includes conversations about big data. Big problems need solutions that can operate at “big” scale, and the phenomenon of big data is certainly real. The three Vs of volume, velocity, and variety, coined by the Gartner Group, have helped us frame the characteristics of what we understand as big data.

Ultimately, what matters is how these problems get solved using distributed data and distributed processing. Some will do things “internally” while others will take to the cloud. But as many have already experienced, some of the “cloud benefits” (related to “bursty” allocation of resources) are not there for “big data” configurations.

Said more simply, the benefits of lightly touching the cloud resources and getting the financial benefit of this time-sharing is diminished for big data problems that keep the resources fully utilized and thereby incur the highest order of expense against the cloud infrastructure. This reality affects how we must architect solutions that cross into the cloud and make use of “heavy lifting” within our own corporate intranet infrastructure. It keeps the “big data” problem interesting for sure.
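The time-sharing argument above can be sketched with back-of-the-envelope arithmetic; all prices here are hypothetical, chosen only to illustrate why bursty workloads favor the cloud while fully utilized big-data clusters do not:

```python
# Hypothetical hourly prices, for illustration only.
ON_DEMAND_PER_HOUR = 0.50   # cloud: pay only while running
OWNED_PER_HOUR     = 0.15   # in-house hardware, amortized, paid 24/7

HOURS_PER_MONTH = 730

def monthly_cost(utilization):
    """Monthly cost of one node at a given fraction of hours actually used."""
    cloud = ON_DEMAND_PER_HOUR * HOURS_PER_MONTH * utilization
    owned = OWNED_PER_HOUR * HOURS_PER_MONTH  # incurred whether used or not
    return cloud, owned

burst_cloud, burst_owned = monthly_cost(0.10)   # bursty: 10% utilized
busy_cloud,  busy_owned  = monthly_cost(1.00)   # big data: always busy

print(f"bursty: cloud ${burst_cloud:.2f} vs owned ${burst_owned:.2f}")
print(f"full:   cloud ${busy_cloud:.2f} vs owned ${busy_owned:.2f}")
```

Under these assumed prices the cloud wins handily at low utilization and loses badly at full utilization, which is exactly the architectural tension the paragraph describes.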

With all of that being said, it’s quite another thing when you start to hear how big data is going to upend everything. It is quite unlikely that big data will usher in a “revolution” to transform how we live, work, and think. We do well to approach the topic of big data as just a new tool in the toolkit and use it for those problems where it makes sense.

Realities of the New Work Environment April 15, 2013

Posted by stewsutton in business analytics, business intelligence, Cloud, Cloud Computing, Collaboration, Communications, Community, Data Portability, Economics, Information Policy, Information Technology, Knowledge Management, Software.

Trends such as globalization, economic change, externalization, and consumerization are creating  new realities in the modern information workplace.  Here are four workplace realities that are already having an effect on the way we get things done.

1. Greater Interdependence – Employees collaborate with many more individuals in their day-to-day work than they did just a decade ago (typically ten or more). As a result, nearly one-half of an employee’s impact on business unit profitability comes from network performance—the ability to help others perform and be helped by others. In contrast, in 2002, nearly 80% of an employee’s impact came from individual task performance. Although network performance is vital, only 20% of employees are effective at it. The way IT supports enterprise collaboration must change as IT adopts techniques to understand and support the needs of teams and individuals.

2. Frequent Organizational Change – Clearly organizations have never stood still.  However, a majority of employees feel that the rate of change is accelerating. Since 2010, the average employee has experienced a major change (a reorganization, strategy revision, or new leadership) roughly every seven months. This state of near continuous change shortens business partner time horizons and puts a premium on responsive IT planning and budgeting. It also undermines efforts to encapsulate business process in enterprise systems and increases the value of integration.

3. Greater Knowledge Intensity – Ah, the Knowledge Management stuff…  An increasing percentage of employees (over 80%) are conducting knowledge work that requires analysis and judgment. Knowledge work is becoming ubiquitous because of transaction automation and the emergence of “big data.” In addition, business volatility means that even when transactions remain manual, there are plenty of exceptions that require analysis and judgment to resolve. Information technology investments are already changing to reflect this trend, with more money being spent on analytics and collaboration and less on process automation.

4. More Technology Choice – It is commonly reported that a serious majority (nearly two-thirds) of employees use personal devices for work purposes.  This is huge!   However, this transition to device consumerization is only the starting point. After BYOD comes BYOI, BYON, and BYOA; bring your own information, networks, and applications. Almost one-half of all employees already use external, unofficial information sources for work purposes,  about a quarter of employees source their own collaboration and networking tools, and a fifth of employees use their own analytic tools. Although BYO has risks, it cannot be stopped. Managed correctly, it can provide faster access to new capabilities and a better fit with individual employee workflows.

New Roles within Enterprise IT April 13, 2013

Posted by stewsutton in Architecture, business analytics, business intelligence, Cloud, Cloud Computing, Collaboration, Communications, Education, Information Technology, Knowledge Management, Software.

Talent within information technology is adapting to new roles as the work environment changes over the next five years (2013 to 2017).

  1. Collaboration and Social Media Evangelist – Responsible for understanding drivers of collaborative behavior and creating, managing, and developing a collaboration and social media strategy.   The person is likely to have a background in business, marketing, communications, or behavioral science, such as anthropology, organizational behavior, or psychology; more likely to be found in a consultant or other specialized role than in a corporation.  Their job responsibilities include:
    • Analyzing user behavior to understand workflows and collaboration needs
    • Establishing collaboration and social media strategy
    • Encouraging adoption of relevant collaboration and social media tools and techniques
    • Advocating for adoption of collaboration tools
    • Creating and delivering end-user awareness and training programs
    • Establishing collaboration and social media usage policies and procedures
  2. Information Insight Enabler – This role helps the IT organization drive employee productivity and equips employees with competencies, not just tools.  They are responsible for supporting business unit heads, service managers, and knowledge workers with insight, business intelligence, and management reports for effective decision making.  They are likely to have a background in market or financial research or in analytics or statistics.  Their key responsibilities include:
    • Understanding the decision-making process and the workflows of business unit heads and service managers
    • Identifying knowledge worker’s information needs
    • Representing information in a user-friendly manner
    • Identifying trends and patterns, and generating insight for business units and senior leadership
    • Developing frameworks and processes to analyze unstructured information
    • Performing market and customer research and analysis, and creating dashboards and scorecards
  3. Cloud Integration Specialist – This role assimilates cloud services—for both Applications and Infrastructure—into the existing IT environment.  They have experience in developing, deploying, and maintaining integration solutions; most likely to come from EAI or middleware implementation background, such as EAI/Integration developer.  Key responsibilities for this role include:
    • Collaborating with business unit leaders, service managers, and technology brokers to evaluate new cloud service offerings and determine integration needs
    • Coordinating with enterprise and information architects to ensure new cloud services align with technology roadmap
    • Working closely with business process analyst to ensure integration activities improve business processes
    • Coordinating testing efforts to identify and resolve any cross-functional integration issues
  4. User Experience Guru – This role collaborates with service managers and end users to understand and improve user experience and workflow for new and existing applications.   They are likely to have a specialist background in behavioral science, graphic design, or product design; more likely to be found in a consultant or other specialized role than in a corporation.  They will design and configure user-centric interfaces for in-house and cloud applications, allowing end users to access, visualize, and navigate information and analytics with ease.  Some of their key responsibilities include:
    • Analyzing business and functional requirements
    • Creating user-centered design
    • Improving the user experience
    • Visualizing and presenting information in a user- friendly manner to end users
  5. Technology Broker – This role is responsible for managing spend with all providers in a given category, such as Infrastructure or Applications.  They are likely to have a background in sales or business development at a technology service provider; alternatively, may have a procurement background or extensive experience managing programs that relied on external providers for delivery.  They will introduce new technologies and vendors to business units, the services group, and the remaining IT organization.  Their key roles include:
    • Understanding business needs and translating them into technology capabilities
    • Identifying new and existing technology offerings available in the market or in house
    • Negotiating contracts and managing relationships with multiple vendors for a category of IT spend
    • Creating and maintaining a catalog of technology services
    • Defining service level agreements to monitor vendor performance
  6. End-to-End IT Service Manager – An end-to-end IT service packages all the technologies, processes, and resources across IT needed to deliver a specific business outcome while hiding technical complexity.  This role is responsible for defining and delivering end-to-end IT services and is the primary owner of one or more services.  They are likely to have experience in IT service delivery, direct business engagement, technology sales and marketing, and financial plan development; more likely to be sourced from account manager, business relationship manager, solutions manager, architect, or infrastructure manager roles.  Responsibilities for this role include:
    • Collaborating with IT–business relationship managers to develop the end-to-end IT services strategy
    • Developing the annual IT service delivery plan and negotiating delivery expectations with business partners
    • Providing information to business partners about service improvement opportunities and collaborating with them to drive down business costs and effectively support business capabilities
    • Guiding the service review process to drive continuous improvement efforts for services

Knowledge Management Updated February 1, 2013

Posted by stewsutton in business analytics, business intelligence, Collaboration, Communications, Knowledge Management, Software.

Knowledge Management has undergone a significant change of emphasis over the past several years. We have moved beyond the days when KM was centered in the design and deployment of content management solutions, the fanfare around launching communities of practice, and the practical benefits of sharing lessons learned. The emphasis today is a complete reformulation of how we can extract value from information. The new intersecting themes generally include “big data”, “business analytics / business intelligence”, “social”, and “mobility”.

Leading up to this new arrangement of priorities, KM bounced around in the past several years seeking to find itself as the rapid technology changes in mobile platforms and across consumer-based social platforms took center stage. This caused the KM emphasis to drift toward “collaboration” things – after all, isn’t sharing knowledge through collaboration what it’s all about? Community models for knowledge stewardship have fallen out of favor while crowdsourcing of answers has increased in popularity. The practical difficulty is that running a business, engineering improvements to complex operational procedures, and designing and manufacturing precision equipment cannot be directly guided by the wisdom of any crowd.

And while the phrase knowledge management has often been met with resistance (“I don’t want someone managing my knowledge…”), the new KM emphasis has oriented itself to offering value to the business and to the individual creators and managers of knowledge across the workforce. Consider for a moment this simple eight-step framework that illustrates the intersecting themes of knowledge and information management today…

1. Identify a relevant “structured” data source associated with our business
2. Repeat #1 multiple times, not really knowing (just yet) the intersecting relevance
3. Prepare the data sources identified in #2 so that they may be “accessed” and integrated
4. Connect results of #3 by using software to analyze and discover “features” in the data
5. Overlay social graphs, geo-mapping, and other information sources to illuminate findings
6. Clarify and contextualize these findings, draw conclusions, and propose business changes
7. Implement business changes and monitor #5 to assess the impact of business changes
8. Make necessary adjustments based on #7, and return to #1 to discover more relationships
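Steps 1 through 4 of the framework can be sketched in a few lines of Python with pandas; the two “structured” sources, their columns, and the discovered “feature” (a simple correlation) are all invented for illustration:

```python
import pandas as pd

# Steps 1-2: two structured sources whose intersecting relevance
# is not known up front (both datasets are hypothetical).
sales = pd.DataFrame({
    "region":  ["N", "S", "N", "S"],
    "month":   [1, 1, 2, 2],
    "revenue": [100, 80, 120, 70],
})
complaints = pd.DataFrame({
    "region":  ["N", "S", "N", "S"],
    "month":   [1, 1, 2, 2],
    "tickets": [5, 12, 4, 15],
})

# Step 3: prepare and integrate the sources on their shared keys.
joined = sales.merge(complaints, on=["region", "month"])

# Step 4: let software surface a "feature" in the connected data --
# here, the correlation between revenue and complaint volume.
feature = joined["revenue"].corr(joined["tickets"])
print(f"revenue/tickets correlation: {feature:.2f}")
```

The later steps (overlaying other sources, contextualizing, implementing changes, and looping back) build on exactly this kind of integrated result.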

Using tablet-based dashboards will expand the scope and impact of business intelligence and business analytics for both strategic and tactical value

Knowledge Management is becoming a more refined and mature information science. The tools for business are enabling broad and sophisticated analysis of data and application of business analytics by individuals who have line-of-business accountability. From middle management to executives, a new landscape of configurable mobile dashboards that encourage experimentation and insight building is emerging quickly. Big data is being gathered everywhere. The interface methods are being dramatically simplified. And mobile (tablet-based) interfaces are being quickly formed, refined, delivered, modified, and shared among co-workers at various levels. Knowledge workers today are more skilled in business intelligence and business analytics.

It is a very interesting time.

Irena Sendler July 2, 2012

Posted by stewsutton in Knowledge Management.

Two things to keep in mind:  

The world hasn’t just become wicked… and….  

The prize doesn’t always go to the most deserving.

Irena Sendler

There recently was a death of a 98-year-old lady named Irena.

During WWII, Irena got permission to work in the Warsaw ghetto as a plumbing/sewer specialist.

 She had an ‘ulterior motive’. 

She KNEW what the Nazis’ plans were for the Jews.

Irena smuggled infants out in the bottom of the tool box she carried, and in the back of her truck she kept a burlap sack (for larger kids).

 She also had a dog in the back that she trained to bark when the Nazi soldiers let her in and out of the ghetto. 

The soldiers of course wanted nothing to do with the dog, and the barking covered the kids’ and infants’ noises. During her time doing this, she managed to smuggle out and save 2,500 kids and infants.


She was caught, and the Nazis broke both her legs and arms and beat her severely.  Irena kept a record of the names of all the kids she smuggled out in a glass jar, buried under a tree in her back yard.


After the war, she tried to locate any parents that may have survived and reunite the families.

Most had been gassed. The kids she helped were placed into foster homes or adopted.

Last year Irena was up for the Nobel Peace Prize. 

 She was not selected.   President Obama won one year before becoming President for his work as a community organizer for ACORN  

Al Gore won also — for a slide show on Global Warming. 





It is now more than 60 years after the Second World War in Europe ended. 

Now, more than ever, with Iran and others claiming the Holocaust to be ‘a myth’, it is imperative to make sure the world never forgets, because there are others who would like to do it again.

The 21st Century Cyber-Economy June 1, 2012

Posted by stewsutton in Economics, Information Policy, Information Technology, Knowledge Management, Politics, Security.

Disclaimer:  This post is composed based on a review of publicly available information and a reflective commentary on events widely reported within the press.  No “inside” information or perspective has colored or enhanced the commentary presented here, and these opinions are solely and completely associated with the author.

As a technologist with an interest in preserving knowledge and improving the way we work with information, I am distracted by the subject of cyber warfare.  It’s all over the news and hard to ignore if you have access to the Internet, TV, or national newspapers.  According to a report in the New York Times, President Obama accelerated a plan to cripple Iran’s uranium enrichment program by hacking into the program’s home base, Natanz.  Although the program began in 2006, the Times reports that Obama pushed increasingly aggressive attacks, even in his early days in office.

The New York Times writer David E. Sanger claims that the U.S., together with Israel, developed a worm called Stuxnet.  This is a very sophisticated virus – so advanced that Symantec and other security experts could not figure out who made it.  The Stuxnet worm crippled 20% of the Iranian centrifuges at Natanz.  Analysts and critics are still debating whether Stuxnet seriously impeded Iran’s nuclear ambitions or just slowed them down a bit.  However, nobody is denying the serious implications this tactic has for modern warfare.

This TED Talk video from March 2011 describes how, when first discovered in 2010, the Stuxnet computer worm posed a baffling puzzle.  Beyond its sophistication at the time, there loomed a more troubling mystery: its purpose.  Ralph Langner, a German control system security consultant, and his team helped crack the code that revealed this digital warhead’s final target.  In a fascinating look inside cyber-forensics, he explains how — and makes a bold (and, it turns out, correct) guess at its shocking origins.

The Stuxnet cyber attack represents a new, formally established method of warfare.  And more recently, the equally insidious Flame virus is capturing the attention of cyber-watchers.  There is still speculation on who originated this specific virus, but the frequency of cyber-actions appears to be on the uptick.

The following references are links to press sites where the ownership and origin of Stuxnet and other viruses within this emerging Cyberwar are being declared.

  1. Slashgear
  2. Washington Post
  3. PC Mag
  4. Ars Technica
  5. Reddit
  6. Telegraph
  7. Extreme Tech
  8. New York Times
  9. Tech Crunch
  10. Yahoo News

With all of this making big news today, June 1, 2012, we have some interesting counter-dialog that emerged in a publication on May 31, 2012 from Adam P. Liff, a doctoral candidate in the Department of Politics at Princeton University.  The title of the article, published in the Journal of Strategic Studies, is Cyberwar: A New ‘Absolute Weapon’? The Proliferation of Cyberwarfare Capabilities and Interstate War.  The article’s central objective is to explore the implications of the proliferation of cyberwarfare capabilities for the character and frequency of interstate war.  The contrarian view expanded on within the paper is that cyberwarfare capabilities may actually decrease the likelihood of war.

In one hypothesis, computer network attacks (CNA) represent a low-cost yet potentially devastating asymmetric weapon.  The hypothesis is that asymmetric warfare will increase the frequency of war by increasing the probability of war between weak and strong states that would otherwise not fight due to the disparity between conventional military strength.

A second hypothesis put forward by Liff is that the plausible deniability and difficulty of attributing cyberattacks could lead potential attackers to be less fearful of retaliation, and thereby use CNA where they would not dare attack with conventional weapons.  Although it took a while for the Stuxnet virus to be attributed, it is likely that the time required to attribute an attack will shrink.  It’s all a matter of computer algorithms, processing power, and effective application of cyber-forensics.  Those with the bigger computers, better algorithms, and smarter scientists have a decided edge in conducting an effective investigation of the cyberwar scene of the crime.

Another hypothesis put forward by Liff is that the difficulty of defending against cyberattacks will render states exceedingly vulnerable to surprise attacks.  And since states cannot afford to absorb a first strike, the offensive advantage of CNA may increase the frequency of preemptive war.  For the next few years, however, it seems the cyberwarfare domain will concede an offensive advantage to actors with considerably more resources.

A summary conclusion offered by Liff is that in some situations CNA as an asymmetric weapon may actually decrease the frequency of war by offering relatively weak states an effective deterrent against belligerent adversaries.  While this opinion is interesting, it seems unlikely, since the ability of weak states to guard their secrets may directly affect another state’s confidence in the success of a preemptive cyberattack.

So with all of this “lead-up”, we can consider the implications of this “cyber stuff” and how it affects our day-to-day information economy and the way we work.  To provide further foundation to this discussion, we need to acknowledge the current information ecosystem that is increasingly prevalent within our modern economy and the relationship of several major components within that ecosystem.  First let us simply describe the information ecosystem:

  1. Wireless access points
  2. Persistent network connections
  3. Data and applications in the cloud

To provide a more visual example of these three elements of our modern information ecosystem in action, consider the person using an iPhone/iPad (access point) over WiFi in a coffee shop (persistent network connection) and updating their Facebook or Twitter account with new information (data and applications in the cloud).  These actions and examples go far beyond the simple consumer picture here and are increasingly the “mix” for business applications.  We are rapidly moving from desktop computers to laptops to tablets, and for the majority of information transactions in the future, mobility will extend into wearable computing devices like fancy eyeglasses (Google Glass).  This increased mobility is irreversible; the technology that supports and speeds it will win in the marketplace and will further establish fully mobile computing as the way we transact with information.  The difficulty, however, is that increased mobility introduces an expanded reliance on networks and the big data/app centers where all the digital stuff we work with is stored and served up.

So while we had the “big 3” automakers back in the 60’s and 70’s, we now have the “big 3” digital service providers of the 21st century in the form of Amazon, Apple, and Google.  You have got your iPhone, iPad, Kindle, or Android device accessing an increasing portfolio of content and applications from the “big 3” digital service providers (think Kindle Books, Apple iTunes Music, and Google Drive for your documents, and other digital data).

So while the cyber threat is usually spoken about in relation to national infrastructure and associated information assets, the threat to basic day-to-day personal information systems is also a reality.  It would not be a national security incident for people to lose access to their music on iTunes, their photo albums on Flickr, or their collection of novels from the Kindle store.  But the increasing ubiquity with which we traverse this collective set of services points toward a more complete reliance on similar arrangements for everything we do within the modern information economy: digital wallets in our mobile phones, online banking from wherever we are, controlling access to systems within our homes, and even an education system with increasing emphasis on online learning and instruction.  It is a pretty safe bet that the information ecosystem will be pervasive and all information will need to successfully traverse it.

Our daily lives (both business and personal) will become increasingly dependent on the reliability and performance of this information ecosystem.  It is a relatively simple inference that the information ecosystem will be under constant attack in the “cyber domain,” and that this will bring about government-based protection of our “cyber border” in a manner not far removed from how we protect our physical border.  Will this have a profound and lasting effect on the way we interact with our information?  Certainly it will.  There will be less anonymity within information transactions (since strong tie-ins with verifiable identity are foundational to improved security).  So you will be leaving digital “breadcrumbs” of transactions wherever you go within the cyber domain.  How much of this digital history will need to be publicly accessible, and how much will remain private, will be the subject of much dialog, innovation, services, and likely government policy.  It is going to be, no, it already is, a brave new world, and digital cyberwar will just be another aspect of the 21st century way of life.

Jane Lynch, Eric Schmidt, and the Digital Gang May 31, 2011

Posted by stewsutton in Knowledge Management.

Tuesday Opening at the Digital Tech Fest…

D9TuesdayIntro.mp3



System Dynamics December 16, 2010

Posted by stewsutton in Information Policy, Information Technology, Knowledge Management.

So the field of System Dynamics is about 50 years old. And while it has been around at least as long as I have been wandering the earth, I only recently connected to the power and potential of this discipline and how it can offer an important way to critically evaluate complex systems. Started around 1961, the field developed initially from the work of Jay W. Forrester. His seminal book Industrial Dynamics (Forrester 1961) is still a significant statement of philosophy and methodology in the field. Within ten years of its publication, the span of applications grew significantly.

So what is system dynamics and how can you define its approach?

  • Defining problems dynamically, in terms of graphs over time.
  • Striving for an endogenous, behavioral view of the significant dynamics of a system, a focus inward on the characteristics of a system that themselves generate or exacerbate the perceived problem.
  • Thinking of all concepts in the real system as continuous quantities interconnected in loops of information feedback and circular causality.
  • Identifying independent stocks or accumulations (levels) in the system and their inflows and outflows (rates).
  • Formulating a behavioral model capable of reproducing, by itself, the dynamic problem of concern. The model is usually a computer simulation model expressed in nonlinear equations, but is occasionally left unquantified as a diagram capturing the stock-and-flow/causal feedback structure of the system.
  • Deriving understandings and applicable policy insights from the resulting model.
  • Implementing changes resulting from model-based understandings and insights.

Mathematically, the basic structure of a formal system dynamics computer simulation model is a system of coupled, nonlinear, first-order differential (or integral) equations.  Simulation of such systems is easily accomplished by partitioning simulated time into discrete intervals and stepping the system through time one interval  at a time.  Each state variable is computed from its previous value and its net rate of change.
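
This stepping scheme is just Euler integration, and it can be sketched in a few lines of Python.  The population model below is a hypothetical illustration (the stock, rate names, and parameter values are mine, not from the sources above); it shows the core mechanic: each state variable is updated from its previous value plus the time step times its net rate of change.

```python
# Minimal sketch of Euler integration for a one-stock system dynamics model.
# Hypothetical example: a population stock with a birth inflow and a death
# outflow, both proportional to the level itself.

def simulate(initial_stock, birth_rate, death_rate, dt, steps):
    """Step a single stock through time: stock += dt * (inflows - outflows)."""
    stock = initial_stock
    history = [stock]
    for _ in range(steps):
        inflow = birth_rate * stock    # rates depend on the current level
        outflow = death_rate * stock
        stock = stock + dt * (inflow - outflow)
        history.append(stock)
    return history

pop = simulate(initial_stock=1000.0, birth_rate=0.03,
               death_rate=0.01, dt=0.25, steps=40)
print(pop[-1])
```

Dedicated tools (and real models with many coupled stocks) use the same idea, often with smaller time steps or higher-order integration methods to control numerical error.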

The simulation tools for System Dynamics have evolved considerably and today there are several different simulation tools that can be acquired to perform research and analysis based on system dynamics methods.

The Feedback Loop is the Key

Conceptually, the feedback concept is at the heart of the system dynamics approach.  Diagrams of loops of information feedback and circular causality are tools for conceptualizing the structure of a complex system and for communicating model-based insights.  Intuitively, a feedback loop exists when information resulting from some action travels through a system and eventually returns in some form to its point of origin, potentially influencing future action.  If the tendency in the loop is to reinforce the initial action, the loop is called a positive or reinforcing feedback loop;  if the tendency is to oppose the initial action, the loop is called a negative or balancing feedback loop.  The sign of the loop is called its polarity. Balancing loops can be variously characterized as goal-seeking, equilibrating, or stabilizing processes.  They can sometimes generate oscillations, as when a pendulum seeking its equilibrium goal gathers momentum and overshoots it.  Reinforcing loops are sources of growth or accelerating collapse;  they are disequilibrating and destabilizing.  Combined, reinforcing and balancing circular causal feedback processes can generate all manner of dynamic patterns.
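
The two polarities are easy to see in simulation.  In this sketch (illustrative parameters and names, not drawn from the sources above), the reinforcing loop’s rate is proportional to the stock itself, so it compounds; the balancing loop’s rate opposes the gap between the stock and a goal, so it converges.

```python
# Contrasting loop polarities with the same Euler-stepping mechanic.

def run(stock, net_rate, dt, steps):
    """Advance a single stock given a function for its net rate of change."""
    for _ in range(steps):
        stock += dt * net_rate(stock)
    return stock

# Reinforcing (positive) loop: change feeds back to amplify itself,
# producing exponential growth (or, with a negative sign, collapse).
reinforcing = run(stock=1.0, net_rate=lambda s: 0.1 * s, dt=1.0, steps=50)

# Balancing (negative) loop: change opposes the discrepancy from a goal,
# so the stock is goal-seeking and settles toward GOAL.
GOAL = 100.0
balancing = run(stock=1.0, net_rate=lambda s: 0.1 * (GOAL - s), dt=1.0, steps=50)

print(reinforcing, balancing)
```

Adding a delay or momentum term to the balancing loop is what produces the pendulum-style overshoot and oscillation described above.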

For understanding, system dynamics practitioners strive for an endogenous point of view.  The effort is to uncover the sources of system behavior that exist within the structure of the system itself.

System structure

These ideas are captured in Forrester’s (1969) organizing framework for system structure:

  • Closed boundary
    • Feedback loops
      • Levels
      • Rates
        • Goal
        • Observed condition
        • Discrepancy
        • Desired action

The closed boundary signals the endogenous point of view.  The word closed here does not refer to open and closed systems in the general system sense, but rather refers to the effort to view a system as causally closed.  The modeler’s goal is to assemble a formal structure that can, by itself, without exogenous explanations, reproduce the essential characteristics of a dynamic problem.

The causally closed system boundary at the head of this organizing framework identifies the endogenous point of view as the feedback view pressed to an extreme.  Feedback thinking can be seen as a consequence of the effort to capture dynamics within a closed causal boundary.  Without causal loops, all variables must trace the sources of their variation ultimately outside a system.  Assuming instead that the causes of all significant behavior in the system are contained within some closed causal boundary forces causal influences to feed back upon themselves, forming causal loops.  Feedback loops enable the endogenous point of view and give it structure.

* Adapted from “What is System Dynamics?” at: http://www.systemdynamics.org/what_is_system_dynamics.html

Additional References

Ford, A. 2009. Modeling the Environment. Washington, DC: Island Press.
Forrester, J.W. 1961. Industrial Dynamics. Cambridge, MA: The MIT Press. Reprinted by Pegasus Communications, Waltham, MA.
Forrester, J.W. 1969. Urban Dynamics. Cambridge, MA: The MIT Press. Reprinted by Pegasus Communications, Waltham, MA.
Maani, K.E. and R.Y. Cavana. 2007. Systems Thinking, System Dynamics: Understanding Change and Complexity. Auckland: Prentice Hall.
Morecroft, J.D.W. 2007. Strategic Modeling and Business Dynamics: A Feedback Systems Approach. Chichester: John Wiley & Sons.
Morecroft, J.D.W. and J.D. Sterman, Eds. 1994. Modeling for Learning Organizations. System Dynamics Series. Cambridge, MA: Pegasus Communications.
Richardson, G.P. 1991/1999. Feedback Thought in Social Science and Systems Theory. Philadelphia: University of Pennsylvania Press. Reprinted by Pegasus Communications, Waltham, MA.
Richardson, G.P., Ed. 1996. Modelling for Management: Simulation in Support of Systems Thinking. International Library of Management. Aldershot, UK: Dartmouth Publishing Company.
Richardson, G.P. and D.F. Andersen. 2010. Systems Thinking, Mapping, and Modeling for Group Decision and Negotiation. In Handbook of Group Decision and Negotiation, C. Eden and D.N. Kilgour, Eds. Dordrecht: Springer, pp. 313-324.
Richardson, G.P. and A.L. Pugh III. 1981. Introduction to System Dynamics Modeling with DYNAMO. Cambridge, MA: The MIT Press. Reprinted by Pegasus Communications, Waltham, MA.
Roberts, E.B., Ed. 1978. Managerial Applications of System Dynamics. Cambridge, MA: The MIT Press. Reprinted by Pegasus Communications, Waltham, MA.
Senge, P.M. 1990. The Fifth Discipline: The Art and Practice of the Learning Organization. New York: Doubleday/Currency.
Sterman, J.D. 2000. Business Dynamics: Systems Thinking and Modeling for a Complex World. Boston: Irwin/McGraw-Hill.
System Dynamics Review. 1985-present. Chichester, UK: Wiley-Blackwell.
Vennix, J.A.M. 1996. Group Model Building: Facilitating Team Learning Using System Dynamics. Chichester: John Wiley & Sons.
Wolstenholme, E.F. 1990. System Enquiry: A System Dynamics Approach. Chichester, UK: John Wiley & Sons.