
Computational Knowledge February 4, 2014

Posted by stewsutton in Architecture, Big Data, business intelligence, Collaboration, Computational Knowledge, Economics, Education, Knowledge Management.

Right now we have a serious need for more students to fall in love with the STEM subjects: science, technology, engineering, and mathematics. We know these fields fuel economic growth, so training a STEM workforce has been recognized as a key goal in education policy. And yet there is an enthusiasm gap in these subject areas, and nowhere is that more evident than in math. In the United States, students don't think they're good at math, so they become quite adept at hating it. Many students, it seems, would rather eat broccoli than do math homework (and that in a culture raised on fast food, where broccoli is viewed as utterly disgusting). Not surprisingly, these students are significantly underperforming. So how do we change this?

The way we teach math needs to be reinvented!

In a nutshell, “students need visual and interactive curriculum that ties into real life.” Nowhere is the power of good mathematical instruction better demonstrated than within the environment of Wolfram Mathematica.

Properly taught, math breaks down into four components:

1. Posing the right questions
2. Turning a real world problem into a math formulation
3. Computation
4. Turning the math formulation back into the real world and verifying the result.

We spend perhaps 80 percent of the time in math education teaching people to do step 3, computation by hand. This is the one step that computers can do better than any human, no matter how many years of practice. Why are we doing this?

Instead, let us use computers to calculate; after all, that is the math chore we hate the most. It may have been necessary to teach this skill 50 years ago, and there are certainly a few practical situations where hand calculation remains useful today, but it no longer deserves the bulk of our instructional time.
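
To make that concrete, here is a minimal sketch of the four-step workflow in Python, with the SymPy library standing in for a Wolfram-style computation engine. The investment scenario and numbers are invented purely for illustration; they are not from the original post.

```python
# Illustrative sketch only: the savings scenario below is hypothetical.
import sympy as sp

# Step 1: pose the right question -- "How long until savings double at 5% annual growth?"
# Step 2: turn it into a math formulation -- solve (1.05)**t = 2 for t.
t = sp.symbols("t", positive=True)
solution = sp.solve(sp.Eq(sp.Rational(105, 100) ** t, 2), t)

# Step 3: computation -- handled by the machine, not by hand.
years = float(solution[0])

# Step 4: turn the result back into real-world terms and verify it.
print(f"The savings double in about {years:.1f} years")  # roughly 14.2 years
assert abs(1.05 ** years - 2) < 1e-6  # sanity check on the formulation
```

The point is not the particular library; it is that the student's time goes into steps 1, 2, and 4 while the machine handles step 3.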

The goal of the Wolfram technology is to collect and curate all objective data; implement every known model, method, and algorithm; and make it possible to compute whatever can be computed about anything. We see this technology achieving some pretty spectacular levels of performance in Wolfram|Alpha and within Mathematica as well. Integrating this form of computational knowledge within classrooms is going to have a powerful multiplying effect on student performance and understanding as they orient themselves to solving real-life problems with the power of computational knowledge.

Health Care – The “simple” economics July 15, 2013

Posted by stewsutton in Economics, Fitness, Healthcare, Politics.

Within the U.S.A. there is an ongoing debate/conversation/argument/fight related to healthcare and how we can make it better.  While there are numerous complexities to the system as it currently exists, there are three principles that can be referenced in a common sense discussion of the healthcare topic:

  1. Cost
  2. Quality
  3. Access

In Rob Rodin's book “Free, Perfect, and Now” (the three insatiable customer demands), we see a simple economic example of three principles that you can never have all at once, completely.  A thing can be free, and you can possibly even get it immediately (now), but it will not be perfect.  For it to be perfect, it would need to address a multitude of different needs.  Making something free and available now requires that it address only a specific (limited) set of needs.

In the same way, if we strive to make a thing “perfect,” it will likely not be free, and in the quest to specify its “perfectness” we have all but guaranteed that it will not be available now.  So with these things in mind, let's consider the simple principles of healthcare economics (cost, quality, and access).  They follow the same free-perfect-now dimensions described in Rob Rodin's book.

For many years America had the best healthcare system in the world.  And even today, one can say with confidence that the quality of American healthcare is the highest in the world.  However, in our quest for quality we have introduced significant complexity into the system.  This, it seems, is driven by a belief that the healthcare system can be engineered at a massive national level.  That is where we have made a bet that is not working out quite as planned.

Not that long ago (in the 1950s and 1960s) there was a lot of research going on in healthcare, and that research found its way into practice through the individual motivation of practicing doctors, nurses, and other healthcare professionals.  Practicing physicians learned new things and applied them to improving their patients' outcomes.  Things were working pretty well.

Then, it seems, a movement arose to try to “assure” that all medical professionals were following the same quality guidelines, and moreover that these guidelines would be enforced.  About the same time there was a desire to make the really expensive procedures affordable to patients, so medical insurance became commonplace and, increasingly, a key part of the compensation package offered when working for a company.  Well, now nearly all companies did this, and that is where we start to see some inequality in the system: not everyone works for an employer that provides coverage, so not everyone can “afford” to get “access” to the higher quality of care that is offered to those with insurance.

In an attempt to address this “cost of access” problem, the concept of managed care was introduced, with the idea that a free-market system operating at a bigger scale could drive efficiency, deliver high-quality (but expensive) procedures at a lower price point, and therefore open access to patients who could not previously afford such services.

And while all of this is happening, we see an increasing burden and financial catastrophe taking shape within American hospitals.  They use the revenue from high-end, paid procedures to finance “free” services for those who cannot afford to pay.  To a great extent, the hospitals have become the front lines for delivering free healthcare to those in need who lack the ability to pay.  The economics of this strange configuration drives up the cost of the “paid” procedures, and since there are now layers of administration and oversight that must be compensated within this delivery model, costs rise further.

Then along comes the Affordable Care Act (Obamacare).  This system is proposed to solve the skyrocketing costs of healthcare while also extending access to 100% of the population and, at the same time, keeping quality where it is.  In effect, the Affordable Care Act seeks to make healthcare free, perfect, and now.  We have seen that this is just not possible, and trying to engineer the impossible on a grand national scale makes it even less likely.

So my suggestion is to go back to the way we practiced healthcare before the days of managed care, big insurance, and heavy government administrative regulation.  We can have local doctors who get good educations, keep abreast of new developments, and, at their individual pace, factor those developments into their medical practices, thereby improving outcomes for their patients.

With such a system we will certainly have “good doctors” and “bad doctors,” and the outcomes that follow from that disparity in quality.  But since we cannot engineer 100% success in cost, quality, and access, this highly resilient and distributed approach can more effectively serve national needs by focusing on local community needs.  It is a simple approach, and we need not make the problem complex just so we feel compelled to solve it with a massive national program.  The solution, it seems, is to back away from the massive national program and return to basic principles that have always worked best in the aggregate: fostering a provider-client relationship rooted in the local community.

Realities of the New Work Environment April 15, 2013

Posted by stewsutton in business analytics, business intelligence, Cloud, Cloud Computing, Collaboration, Communications, Community, Data Portability, Economics, Information Policy, Information Technology, Knowledge Management, Software.

Trends such as globalization, economic change, externalization, and consumerization are creating  new realities in the modern information workplace.  Here are four workplace realities that are already having an effect on the way we get things done.

1. Greater Interdependence – Employees collaborate with many more individuals in their day-to-day work than they did just a decade ago (typically ten or more). As a result, nearly one-half of an employee's impact on business-unit profitability comes from network performance: the ability to help others perform and be helped by others. By contrast, in 2002, nearly 80% of an employee's impact came from individual task performance. Although network performance is vital, only 20% of employees are effective at it. The way IT supports enterprise collaboration must change as IT adopts techniques to understand and support the needs of teams and individuals.

2. Frequent Organizational Change – Clearly, organizations have never stood still.  However, a majority of employees feel that the rate of change is accelerating. Since 2010, the average employee has experienced a major change, such as a reorganization, strategy revision, or new leadership, roughly every seven months. This state of near-continuous change shortens business partners' time horizons and puts a premium on responsive IT planning and budgeting. It also undermines efforts to encapsulate business processes in enterprise systems and increases the value of integration.

3. Greater Knowledge Intensity – Ah, the Knowledge Management stuff…  An increasing percentage of employees (over 80%) are conducting knowledge work that requires analysis and judgment. Knowledge work is becoming ubiquitous because of transaction automation and the emergence of “big data.” In addition, business volatility means that even when transactions remain manual, there are plenty of exceptions that require analysis and judgment to resolve. Information technology investments are already changing to reflect this trend, with more money being spent on analytics and collaboration and less on process automation.

4. More Technology Choice – It is commonly reported that a sizable majority (nearly two-thirds) of employees use personal devices for work purposes.  This is huge!  However, this transition to device consumerization is only the starting point. After BYOD comes BYOI, BYON, and BYOA: bring your own information, networks, and applications. Almost one-half of all employees already use external, unofficial information sources for work purposes; about a quarter source their own collaboration and networking tools; and a fifth use their own analytic tools. Although BYO has risks, it cannot be stopped. Managed correctly, it can provide faster access to new capabilities and a better fit with individual employee workflows.

Big Data Portfolio April 12, 2013

Posted by stewsutton in Cloud Computing, Economics, Financial, Information Technology, Investment.

Currently assembling a “big data” portfolio of companies that are invested in technology and services for processing and storing “big data,” or that use digital information directly in the formation of manufactured goods.  All of the usual suspects from Amazon to Google will be in the mix, and there are the storage companies like Western Digital, but there are others like IBM, which brings the processing technology of Watson and a commitment to investing in high-performance flash storage within its enterprise systems.  On the digital-information side there are companies invested in the growth of 3D printing with multiple materials.  It's an interesting cross-section of technology companies that can continue to be a significant component of the information economy.  So here is the list of companies thus far.

Each entry below lists the company, its ticker, what it sells, and its 52-week range.

3D Systems (DDD): Through subsidiaries, designs, develops, manufactures, markets, and services rapid 3-D printing, prototyping, and manufacturing systems and related products and materials. 52-week range: $15.40 – $47.99. Website: http://www.3dsystems.com/
Western Digital Corp. (WDC): Designs, develops, manufactures, and sells hard drives for data storage. 52-week range: $28.31 – $53.75. Website: http://www.westerndigital.com/
Stratasys (SSYS): Develops, manufactures, and sells 3-D printers that create physical models from computerized designs. 52-week range: $34.50 – $92.30. Website: http://www.stratasys.com/
Corning (GLW): Makes specialty glass and ceramics used in everything from flat-screen TVs to optical fiber to biosensors for drug research. 52-week range: $10.62 – $14.58. Website: http://www.corning.com/
Gartner (IT): A research & advisory firm that helps executives use technology to build, guide & grow their enterprises; offers independent & objective research & analysis on information technology, computer hardware, software, communications, and more. 52-week range: $39.50 – $57.61. Website: http://www.gartner.com/
Teradata (TDC): Provides data warehousing services. 52-week range: $50.40 – $80.97. Website: http://www.teradata.com/
Google (GOOG): So dominant it's a verb, Google is the leading Internet search provider and uses its proprietary algorithms to offer targeted advertising. 52-week range: $556.52 – $844.00. Website: http://www.google.com/
Oracle Corp. (ORCL): Develops, manufactures, markets, distributes, and services database, middleware, and applications software that helps organizations manage and grow their businesses. 52-week range: $25.33 – $36.43. Website: http://www.oracle.com/
Amazon (AMZN): Once simply an online purveyor of books, Amazon.com has become a marketplace for just about anything you'd want to buy. 52-week range: $183.65 – $284.72. Website: http://www.amazon.com/
Apple (AAPL): From iPods to iPhones to MacBooks, Apple uses its “think different” approach to reframe computing, communication, and more. 52-week range: $419.00 – $705.07. Website: http://www.apple.com/
Nvidia (NVDA): Deals in worldwide programmable graphics processor technologies; its major product-line segments are graphics processing units, media and communications processors, and handheld and consumer electronics. 52-week range: $11.15 – $15.22. Website: http://www.nvidia.com/
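
Since the list above is really just structured data, here is a small illustrative sketch (in Python, not part of the original post) that loads the 52-week ranges from the table and ranks the holdings by how wide each range is relative to its low, a rough proxy for how volatile the past year has been:

```python
# Illustrative only: prices are the 52-week ranges quoted in the table above.
portfolio = {
    "DDD":  (15.40, 47.99),
    "WDC":  (28.31, 53.75),
    "SSYS": (34.50, 92.30),
    "GLW":  (10.62, 14.58),
    "IT":   (39.50, 57.61),
    "TDC":  (50.40, 80.97),
    "GOOG": (556.52, 844.00),
    "ORCL": (25.33, 36.43),
    "AMZN": (183.65, 284.72),
    "AAPL": (419.00, 705.07),
    "NVDA": (11.15, 15.22),
}

def range_spread(low_high):
    """52-week range width as a percentage of the 52-week low."""
    low, high = low_high
    return (high - low) / low * 100

for ticker, prices in sorted(portfolio.items(), key=lambda kv: range_spread(kv[1]), reverse=True):
    print(f"{ticker:5s} 52-week range spans {range_spread(prices):6.1f}% of its low")
```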

What's New In The Cloud? December 10, 2012

Posted by stewsutton in Cloud, Cloud Computing, Data Portability, Economics, Information Technology.

The Cloud.  That vast and curious location that seems to be good for everything.  We can store our photos, our books, our music, and our various working files there.  Beyond all of that data, we can also do real computing in the cloud: the sort of computing that we used to accomplish on large corporate computing infrastructure or even on our own personal computers.  So why does this matter?

Well, the changes and transformation of services that are being made available to both companies and individuals are affecting the way we use our computers, our laptops, our tablets, and our smart phones.  Consider some of the changes that have already been adopted by many:

  1. Keep your music in iTunes and use iTunes Match to sync all of your songs across all your devices, anytime and anywhere they are connected to the Internet.  This is the cloud jukebox that you own, and it is ready to play your music at any time.
  2. Buy your books on Amazon and you have a permanent digital library that spans your iPad, your Kindle, your iPhone, your computer, and any other digital device you own.  Download any of your “books” at any time from the cloud library and enjoy reading them on your device.  As you switch between devices, the cloud keeps your location synced so that you easily resume where you left off.
  3. Photo services like Flickr and others allow you to upload and stream your photos as needed across any of your digital viewing devices.  This is your photo album in the cloud and there are many choices for your digital photo albums.  Many seem to even use services like Facebook and Twitter as a way to store and share their photos – especially photos captured on smart phones.

With these changes having become commonplace, might we consider the digital cloud to be our infinite network disk drive and the home of our favorite applications?  Probably so.  This will have the biggest impact on how we “manage” our data.  Not that long ago we probably kept our important data on a local computer in our office or home.  If we were disciplined (and cautious), we likely made some effort to occasionally back up or copy the important information onto another computer disk so that we could recover if our computer “had a problem.”

One of the major differences in our day-to-day relationship with cloud-based data is that we are not typically given simple options to “copy” or “back up” that data to our local disk.  Some services provide this, and others take it a step further by offering cloud-based backup of the data.  If you are with a top-tier provider of applications and data services (e.g., Amazon, Apple, Google), your data is unlikely to disappear due to bad procedures or failed equipment.  It is also increasingly common for new companies to offer compelling services that sit atop the infrastructure of a company like Amazon; instead of reinventing all of this cloud infrastructure and operations, the new company leverages what is already a proven, reliable asset.

Each of us will likely be offered new services by the top-tier cloud providers in the coming years.  These services will range from banking services and digital safe-deposit boxes to high-end applications that we generally associate with a dedicated computer.  The difference is that we will “rent” the services in much the same way that we “rent” phone minutes and cable TV channels.  Data portability will be one of the important characteristics that separate the better providers from the rest.  Making sure that you can “get a full copy” of your data and move it to another cloud provider will be a key criterion for selecting a provider.  As we move toward more and more cloud services, data portability should be top of mind for everyone.
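
To make the “get a full copy” idea concrete, here is a minimal sketch that pulls every object out of a cloud storage bucket into a local folder.  It assumes an S3-compatible service and the boto3 library, and the bucket name "my-cloud-data" is hypothetical; it is an illustration of the portability habit, not a recommendation of any particular provider.

```python
# Illustrative sketch: bucket name and local path are hypothetical.
import os
import boto3

def download_bucket(bucket_name: str, dest_dir: str) -> None:
    """Copy every object in an S3-compatible bucket to a local directory."""
    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket_name):
        for obj in page.get("Contents", []):
            key = obj["Key"]
            if key.endswith("/"):      # skip "folder" placeholder objects
                continue
            local_path = os.path.join(dest_dir, key)
            os.makedirs(os.path.dirname(local_path) or dest_dir, exist_ok=True)
            s3.download_file(bucket_name, key, local_path)
            print(f"copied {key}")

if __name__ == "__main__":
    download_bucket("my-cloud-data", "local-backup")
```

Whatever the provider, the habit is the same: know how you would walk away with a full copy of your data before you ever need to.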


The 21st Century Cyber-Economy June 1, 2012

Posted by stewsutton in Economics, Information Policy, Information Technology, Knowledge Management, Politics, Security.

Disclaimer:  This post is composed based on a review of publicly available information and a reflective commentary on events widely reported within the press.  No “inside” information or perspective has colored or enhanced the commentary presented here, and these opinions are solely and completely associated with the author.

As a technologist with an interest in preserving knowledge and improving the way we work with information, I am distracted by the subject of cyber warfare.  It's all over the news and hard to ignore if you have access to the Internet, TV, or national newspapers.  According to a report in the New York Times, President Obama accelerated a plan to cripple Iran's uranium enrichment program by hacking into the program's home base, Natanz. Although the program began in 2006, the Times reports that Obama pushed increasingly aggressive attacks, even in his early days in office.

The New York Times writer David E. Sanger claims that the U.S., together with Israel, developed a worm called Stuxnet.  This is a very sophisticated piece of malware, so advanced that Symantec and other security experts could not figure out who made it. The Stuxnet worm crippled 20% of the Iranian centrifuges at Natanz.  Analysts and critics are still debating whether Stuxnet seriously impeded Iran's nuclear ambitions or just slowed them down a bit.  However, nobody is denying the serious implications this tactic has for modern warfare.

This TED Talk video from March 2011 describes how, when first discovered in 2010, the Stuxnet computer worm posed a baffling puzzle. Beyond its sophistication at the time, there loomed a more troubling mystery: its purpose. Ralph Langner, a German control-system security consultant, and his team helped crack the code that revealed this digital warhead's final target. In a fascinating look inside cyber-forensics, he explains how, and makes a bold (and, it turns out, correct) guess at its shocking origins.

The Stuxnet cyber attack represents a new formally established method of warfare.  And more recently, the equally insidious Flame virus is capturing the attention of cyber-watchers. There is still speculation on who originated this specific virus.  But the frequency of cyber-actions appears to be on the uptick.

The following references are links to press sites where the ownership and origin of Stuxnet and other viruses within this emerging Cyberwar are being declared.

  1. Slashgear
  2. Washington Post
  3. PC Mag
  4. Ars Technica
  5. Reddit
  6. Telegraph
  7. Extreme Tech
  8. New York Times
  9. Tech Crunch
  10. Yahoo News

With all of this making big news today, June 1, 2012, there is some interesting counter-dialog in a paper published on May 31, 2012 by Adam P. Liff, a doctoral candidate in the Department of Politics at Princeton University.  The article, published in the Journal of Strategic Studies, is titled “Cyberwar: A New ‘Absolute Weapon’? The Proliferation of Cyberwarfare Capabilities and Interstate War.”  Its central objective is to explore the implications of the proliferation of cyberwarfare capabilities for the character and frequency of interstate war.  The contrarian view developed within the paper is that cyberwarfare capabilities may actually decrease the likelihood of war.

In one hypothesis, computer network attacks (CNA) represent a low-cost yet potentially devastating asymmetric weapon, one that could increase the frequency of war by raising the probability of conflict between weak and strong states that would otherwise not fight because of the disparity in their conventional military strength.

A second hypothesis put forward by Liff is that the plausible deniability and difficulty of attributing cyberattacks could lead potential attackers to be less fearful of retaliation, and thereby to use CNA where they would not dare attack with conventional weapons.  Although it took a while for the Stuxnet virus to be attributed after its disclosure, it is likely that attribution will get faster over time.  It is all a matter of computer algorithms, processing power, and the effective application of cyber-forensics.  Those with the bigger computers, better algorithms, and smarter scientists have a decided edge in conducting an effective investigation of the cyber-war scene of the crime.

Another hypothesis put forward by Liff is that the difficulty of defending against cyberattacks will render states exceedingly vulnerable to surprise attack, and since states will not feel they can afford to absorb the first blow, the offensive advantage of CNA may increase the frequency of preemptive war.  For the next few years, however, it seems the cyberwarfare domain will concede the advantage to actors with considerably more resources, giving those actors the offensive edge.

A summary conclusion offered by Liff is that, in some situations, CNA as an asymmetric weapon may actually decrease the frequency of war by offering relatively weak states an effective deterrent against belligerent adversaries.  While this opinion is interesting, it seems unlikely, since a weak state's ability to guard its secrets may directly affect another state's confidence in the success of a preemptive cyberattack.

So with all of this “lead-up”, we can consider the implications of this “cyber stuff” and how it affects our day-to-day information economy and the way we work.  To provide further foundation to this discussion, we need to acknowledge the current information ecosystem that is increasingly prevalent within our modern economy and the relationship of several major components within that ecosystem.  First let us simply describe the information ecosystem:

  1. Wireless access points
  2. Persistent network connections
  3. Data and applications in the cloud

To provide a more visual example of these three elements in action, consider a person using an iPhone or iPad (access point) over Wi-Fi in a coffee shop (persistent network connection) and updating their Facebook or Twitter account with new information (data and applications in the cloud).  These examples go far beyond simple consumer use and are increasingly the “mix” for business applications.  We are rapidly moving from desktop computers to laptops to tablets, and for the majority of future information transactions, mobility will extend into wearable computing devices like fancy eyeglasses (Google Glass).  This increased mobility is irreversible; the technology that supports and speeds it will win in the marketplace and will further establish fully mobile computing as the way we transact with information.  The difficulty, however, is that increased mobility introduces an expanded reliance on networks and on the big data and application centers where all the digital stuff we work with is stored and served up.

So while we had the “big 3” automakers back in the 1960s and 1970s, we now have the “big 3” digital service providers of the 21st century in the form of Amazon, Apple, and Google.  You have your iPhone, iPad, Kindle, or Android device accessing an increasing portfolio of content and applications from these providers (think Kindle books, Apple iTunes music, and Google Drive for your documents and other digital data).

So while the cyber threat is usually discussed in relation to national infrastructure and its associated information assets, the threat to basic day-to-day personal information systems is also real.  It would not be a national security incident for people to lose access to their music on iTunes, their photo albums on Flickr, or their collection of novels from the Kindle store, but the increasing ubiquity with which we traverse this collective set of services points toward a more complete reliance on similar arrangements for everything we do within the modern information economy: digital wallets in our mobile phones, online banking from wherever we are, controlling access to systems within our homes, and even an education system with increasing emphasis on online learning and instruction.  It is a pretty safe bet that the information ecosystem will be pervasive and that all information will need to traverse it successfully.

Our daily lives (both business and personal) will become increasingly dependent on the reliability and performance of this information ecosystem.  It is a relatively simple inference that the information ecosystem will be under constant attack in the “cyber domain,” and that this will drive a more complex relationship with government-based protection of our “cyber border,” in a manner not that far removed from how we protect our physical border.  Will this have a profound and lasting effect on the way we interact with our information?  Certainly it will.  There will be less anonymity in information transactions (since strong tie-ins with verifiable identity are foundational to improved security), so you will be leaving digital “breadcrumbs” of your transactions wherever you go within the cyber domain.  How much of this digital history will need to be publicly accessible, and how much will remain private, will be the subject of much dialog, innovation, services, and likely government policy.  It is going to be, no, it already is a brave new world, and digital cyberwar will just be another aspect of the 21st-century way of life.

Are the Economics Viable? December 23, 2011

Posted by stewsutton in Communications, Economics, Education.

There are enormous changes taking place in many businesses and across multiple markets.  One need only look at a newspaper, a magazine, or web-based media to see this rapid change.  However, within the rush to become more cost-effective in how we execute our business, we should also carefully consider the implications of making reductions, sometimes significant reductions, in areas that appear to be non-essential.  Even that phrase, non-essential, has a rather strange ring, don't you think?  It implies that when we are doing well, we can waste resources in areas that are non-essential, and it's only when things get tight that we must be realistic in our allocation of resources.

At some level it's as if we need to go on a resource diet after a season or two of overindulgence.  This is a cycle that seems to repeat across all industries and throughout history.  We never seem to learn from our past, even with its record being so clear.  A couple of examples come to mind that illustrate the poorly planned cutting taking shape within two distinctly different industries.

The first example is within the banking industry.  One of the nation's leading banks is making some dramatic adjustments to its allocation of internal resources (in the form of staff reductions) where the role and function of that staff is directed squarely at the quality of the bank's communications.  That is to say, in the spirit of increasing the potential for more profit, the bank is going to reduce the clarity of its customer communications.  This is the sort of thing that typically does not make headlines, and it certainly would not be a candidate for communication to the bank's customers, ironically because the individuals who would write that correspondence will no longer be there.  Some would argue that smart people in the bank's workforce will simply add corporate communications to their list of existing tasks, but when was the last time you thought your bank's correspondence was not long enough, its narrative too short to really matter?  The well-known objective of writing the short letter requires work, no matter the profession.  So the customers of this bank can soon expect some longer letters, or, if a letter is short, it may lack clarity in its intended purpose.

Another example of misplaced economic choices is within the collective set of campuses that comprise the University of California system.  Once considered an incredible value, each dollar spent on a UC education is now increasingly consumed by layers of administration that seek to assure that the delivery of education meets all of the criteria set by another group of administrators.  Gone are the days when the educational dollar paid for faculty, facilities, and supplies.  We now live at a time when layer upon layer of politically correct bureaucracy takes priority over the very service to which the bureaucracy is subordinate.  The priority is no longer the quality of what we teach and the value of that instruction to a person's skills and worth to an employer upon graduation; rather, the more important priority is that we have internal reviewers, compliance administrators, and a significant percentage of the university budget directed toward being compliant with a particular way of delivering education.  This overhead raises educational expenses and takes attention away from learning.  So students get less value and it costs them more. Where is the sound economics in that prioritization?  And could sound economics even be possible within an institution where the administrative component setting the priorities would need to diminish itself to achieve a more effective solution?

Other businesses are going through similar difficulties.  Many organizations make strange choices when confronted with shrinking budgets and rising operational costs.  Will R&D be sacrificed because its benefits are not immediate?  Will processes be restructured in a way that weakens the connection between the provider and the customer of the product or service?  Will we rely too heavily on technologies like social media to establish and maintain a connection where other options should be given priority?  Keep your eyes on the choices taking shape in your workplace and speak up if things seem to be drifting away from basic common sense.  Everybody has the potential to offer perspective on the more viable solution that follows sound economics.