July 13, 2017
Megabytes. Gigabytes. Terabytes. Petabytes. Exabytes… the world’s data is constantly growing and is expected to double every two years as technology permeates nearly every aspect of our lives. Commercial and government organizations alike are racing against time to modernize their information systems so they can scale their operations while preserving data integrity and minimizing business disruption.
Before an organization can catch its breath, it must use these new systems to harness disparate data into manageable, consumable chunks that enable analysts and executives to understand and manage business risk. These efforts can be incredibly taxing on internal resources.
Enter PlanetRisk’s managed service. We provide an architectural framework that offers clients the most complete understanding of their data by streamlining and joining it with ours in a very large graph, bringing previously unknown connections between people, places, things, events, and more into focus. One thing we’ve invested considerable time in getting good at is placing our clients’ data into spatial context. Our discrete global grid (DGG) model enables us to package and contextualize data at global, regional, provincial, or hyperlocal levels, down to a square kilometer. Assigning spatial context to data can help answer an organization’s most pressing business risk questions.
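The details of PlanetRisk’s DGG are not described here, but the core idea of binning points into roughly square-kilometer cells can be sketched with a naive latitude/longitude grid. Everything below (the function names, the grid step, the sample coordinates) is illustrative, not PlanetRisk’s actual model:

```python
import math

# Approximate metres per degree of latitude, and the ~1 km cell edge
# we want, used to derive a grid step in degrees.
METERS_PER_DEG_LAT = 111_320.0
CELL_EDGE_M = 1_000.0
STEP_DEG = CELL_EDGE_M / METERS_PER_DEG_LAT  # grid step in degrees

def cell_id(lat: float, lon: float) -> tuple:
    """Map a point to a roughly 1 km x 1 km grid cell.

    This is naive equirectangular binning; production DGG systems
    (hexagonal or icosahedral grids) avoid the distortion this
    approach has near the poles.
    """
    return (math.floor(lat / STEP_DEG), math.floor(lon / STEP_DEG))

def same_cell(p1, p2) -> bool:
    """True when two (lat, lon) points fall in the same grid cell."""
    return cell_id(*p1) == cell_id(*p2)

# Points a few hundred metres apart usually share a cell;
# points kilometres apart do not.
hq = (38.8895, -77.0353)
nearby = (38.8899, -77.0350)
far = (38.99, -77.10)
```

Once every record carries a cell identifier, joining a client’s data with other spatial layers becomes a simple equality join on that key.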
PlanetRisk puts analysts in the loop – above the data – allowing them to think creatively and share their interpretations with their stakeholders.
One client we work with regularly publishes analyses for public consumption as the authoritative source on a polarizing topic. Rapid information retrieval is critical to maintaining their operational tempo – and their reputation. They came to us with the following challenge: build an entirely new database from roughly 30 years’ worth of data stored on various SharePoint sites, MS Access databases, and individual employees’ hard drives; establish a schema that makes sense for their mission; hide sensitive data from those without a need-to-know; make it easy to perform advanced searches and update records as necessary; and design cool, exportable charts to embed on the customer’s website to show off their analysts’ discoveries.
At PlanetRisk, we’ve intentionally assembled a tech stack that accommodates varying needs across an enterprise, like accepting data of all types and enabling custom access controls so our clients can select what each user or department gets to see. Once the proper data connections were in place, our user group could log into a clean, Google-like interface and quickly access records from a single, consolidated knowledge base. We built a ‘dossier,’ or profile, for each person, organization, location, and event the customer has documented over the years, and displayed the other people, places, and events each search subject is tied to. If a user searches a name and has additional information on that individual, she can add to that record, save, and share it with her colleagues. This collective intelligence tool has given the team far greater visibility into their data as an organization and a sustainable knowledge base that remains in place as staff changes occur.
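The dossier concept above – a profile with facts, need-to-know redaction, and two-way links to related profiles – can be sketched as a small data structure. This is a hypothetical illustration; the class, field names, and integer-clearance scheme are assumptions, not PlanetRisk’s implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Dossier:
    """One profile (person, organization, location, or event) in a
    shared knowledge base, with links to related dossiers."""
    subject: str
    kind: str                                  # e.g. "person", "organization"
    facts: dict = field(default_factory=dict)  # name -> (value, sensitivity)
    links: set = field(default_factory=set)    # subjects of related dossiers

    def add_fact(self, name, value, sensitivity=0):
        """Record a fact, tagged with a sensitivity level."""
        self.facts[name] = (value, sensitivity)

    def view(self, clearance: int) -> dict:
        """Return only the facts the viewer's clearance allows,
        enforcing need-to-know at read time."""
        return {k: v for k, (v, s) in self.facts.items() if s <= clearance}

    def link(self, other: "Dossier"):
        """Record a two-way association between dossiers."""
        self.links.add(other.subject)
        other.links.add(self.subject)

# A user with low clearance sees a redacted view of the same record
# that a fully cleared analyst sees in full.
org = Dossier("Acme Trading Co.", "organization")
org.add_fact("country", "US", sensitivity=0)
org.add_fact("source", "confidential tip", sensitivity=2)
```

Because redaction happens in the read path rather than by maintaining separate copies, everyone updates one shared record while each viewer sees only what their role permits.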
As a service provider, we appreciate the ability to achieve big data ingestion, but we also know that value lies in how well we can analyze and visualize that data for our clients. We find humans don’t like reading big lines of code (or small lines of code, for that matter). Whether you’re a CEO, chief security officer, intelligence analyst, journalist, or lobbyist, your ability to tell a clear and concise ‘story’ that’s supported by data is probably critical to your success.