Market disruptions can motivate interest in adopting new technologies, and I believe four phenomena have been stimulating the technical curiosity of government data management professionals over the past few years:
- Social media – The popularity of social media channels like Facebook, Twitter, and Instagram has created a wealth of streaming unstructured data (i.e., data that is not organized in a predefined manner or does not conform to a defined data model) that is ripe for analysis, especially in the public sector.
- Open data – The “government transparency” movement has spurred both public sector and private sector organizations to release data as well, creating a growing pool of data sources that are ripe for analysis.
- Hadoop and other open source solutions – Hadoop is a great example of an open source software stack that organizations can implement to configure low-cost, high-performance computing environments, lowering the barrier to building systems that can quickly process vast amounts of data.
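To make the Hadoop bullet concrete: the MapReduce pattern that Hadoop popularized splits work into a map phase, a shuffle that groups intermediate results by key, and a reduce phase. This is a minimal sketch of that pattern in plain Python (not Hadoop's actual Java API), using the classic word-count example:

```python
from collections import defaultdict

def map_phase(record):
    # Emit (word, 1) pairs for each word in a line of text
    return [(word.lower(), 1) for word in record.split()]

def reduce_phase(key, values):
    # Combine all counts emitted for the same word
    return (key, sum(values))

def mapreduce(records):
    # Shuffle: group mapped pairs by key, as Hadoop does between phases
    grouped = defaultdict(list)
    for record in records:
        for key, value in map_phase(record):
            grouped[key].append(value)
    return dict(reduce_phase(k, v) for k, v in grouped.items())

documents = ["open data open government", "open analytics"]
print(mapreduce(documents))
# -> {'open': 3, 'data': 1, 'government': 1, 'analytics': 1}
```

Hadoop's value is that it distributes each phase across a cluster of commodity machines, but the programming model is exactly this simple.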
- Increasing number of data breaches – The alarming increase in both industrial and state-sponsored data breaches creates a need for more precise and accurate analysis tools to monitor for catastrophic data breaches.
Government organizations can radically improve their delivery of services to citizens by adopting analytics techniques and becoming more data-driven and analysis-focused. Analytics applications provide analysts with tools and processes to examine massive data sets to unearth hidden patterns, trends, citizen preferences, and other useful information that can lead to better engagement with citizens, improved operational efficiency, savings through reduced costs, new revenue opportunities, and compliance with government rules and regulations.
Analytics for “smarter government” implies handling, processing, and sharing massive amounts of data. That data may be structured data, such as the data managed within a database system, or it might be unstructured data, requiring additional technologies for interpretation and conversion into a usable format. It also requires having the right business analysts tinker with different kinds of analytics solutions, such as predictive analytics, in which historical data is scanned to find behavior patterns that can be used to predict upcoming events or decisions, and prescriptive analytics, which use the discovered patterns to alert analysts to frequently occurring events, such as:
- Analyzing transactions to identify and prevent fraudulent behavior as it is occurring;
- Monitoring suspicious network activities that show signs of a cyber attack or data breach;
- Supporting defense battlefield data fusion activities;
- Identifying potential security threats or safety threats by analyzing the potential connections linking crimes, locations, and individuals;
- Analyzing the provision of social services to improve processes leading to desirable outcomes in more specific social environments; or
- Analyzing the text in massive document repositories to protect those documents containing personally identifiable information.
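The first bullet above — flagging fraudulent transactions as they occur — can be sketched as a predictive model in just a few lines: learn a baseline from historical data, then flag new observations that deviate from it. The field names, sample values, and three-standard-deviation threshold here are illustrative assumptions, not a production fraud rule:

```python
import statistics

def build_baseline(history):
    # Learn a simple spending profile (mean, spread) from historical amounts
    return statistics.mean(history), statistics.stdev(history)

def flag_suspicious(amount, baseline, threshold=3.0):
    # Flag a transaction whose amount deviates from the baseline
    # by more than `threshold` standard deviations
    mean, stdev = baseline
    return abs(amount - mean) > threshold * stdev

history = [42.0, 55.0, 48.0, 51.0, 44.0, 60.0, 47.0]
baseline = build_baseline(history)

print(flag_suspicious(49.0, baseline))    # -> False (typical amount)
print(flag_suspicious(1500.0, baseline))  # -> True (far outside the profile)
```

Real systems layer many such signals and far richer models, but the pattern is the same: historical data defines what “normal” looks like, and the analytics flag departures from it in time to act.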
An easy-to-use analytics solution has the potential for creating value, but there are times when the desire to adopt solutions like big data analytics is hindered by the quirks of government management turnover, the long time-frames of the budgeting process, the reliance on legacy platforms, and the overall complexity of the government’s technology environment. Three key change management hurdles must be overcome to gain the advantage from analytics:
1) Transitioning from a siloed, project-oriented focus to a horizontal/enterprise focus,
2) Committing to being proactive about using analytics solutions, and
3) Overcoming the barriers that are common in public sector organizations regarding change management associated with adopting new technologies.
To overcome the challenges of introducing big data analytics into government environments, look for solutions that enable incremental adoption of methods and technologies. At the same time, make sure that the transition allows you to overcome the change management issues. Choose tools that have a low barrier to entry (such as open source tools like Hadoop, or tools from vendors whose suites are already available for use within the agency), that are easy to use and integrate, that can parse both structured and unstructured data, and that fit well into the existing enterprise technology architecture. By evolving an analytics capability incrementally, you streamline the adoption process and speed the time to value for making analytics a part of the government enterprise environment.
Of course, there is a lot more to think about when considering alternatives to support smart analytics for government. For a broader discussion, I invite you to read my paper “Analytics in Service of Smart Government”, where we explore the motivations for adopting analytics and provide some guidance for evolving capabilities in a way that does not disrupt day-to-day operations.
David Loshin, president of Knowledge Integrity, Inc., is a recognized thought leader in the areas of data quality and governance, master data management, and business intelligence. David is a prolific author on BI best practices via the expert channel at BeyeNETWORK and has written numerous books on BI and data quality. His valuable insights can be found in his book, Master Data Management, which has been endorsed by data management industry leaders.