Network Thinkers on Big Data Spying

While combing the Internet for research links the other night, specifically for text-mining and analysis software, I discovered Orgnet’s blog. Orgnet provides technology and consulting services for a variety of organizations and is a long-time specialist in network analysis.

InFlow is their flagship product.

[InFlow screenshot: formal and informal networks]

The Network Thinkers blog featured two posts responding to the incendiary press reports of NSA surveillance, especially its data collection. I wanted to share them: excerpts below, followed by links to the full posts.

Network thinkers know that to effectively monitor a network, you don’t seek out the edge nodes, you find the central hubs and monitor them — through them you will have access to most of what is flowing through the net.  Below is a network map of the Autonomous Systems [AS] that form the backbone of the internet.  It is easy to find the central hubs in this network.  Load the 20,000+ nodes [each AS is represented by a node] and 48,000+ links [a data flow between two ASes is represented by a link] into a social network analysis software program and have it run the Betweenness or Connector metric.  These two network metrics reveal how central any node is in keeping everything interconnected.  The hubs will be revealed by the network metrics.  In the diagram below the hubs are sized by their Connector score — the higher the score, the larger the node, and the more network paths flow through this node.  The colors are randomly assigned and have no meaning.

T N T — The Network Thinkers: Vacuuming the Internet.
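To make the hub-finding step concrete, here is a minimal sketch in Python using the networkx library. The edge-list file name is a placeholder, and I am approximating InFlow’s “Connector” metric with plain degree centrality — that mapping is my assumption, not something stated in the post.

import networkx as nx

# Load a stand-in for the AS-level topology: one "AS1 AS2" pair per line.
# (Placeholder file; the real map has 20,000+ nodes and 48,000+ links.)
G = nx.read_edgelist("as_links.txt")

# Betweenness: how often a node sits on shortest paths between other nodes.
betweenness = nx.betweenness_centrality(G)

# "Connector" approximated by degree centrality: the share of direct links a node has.
connector = nx.degree_centrality(G)

# The hubs are simply the highest-scoring nodes.
for asn in sorted(connector, key=connector.get, reverse=True)[:10]:
    print(asn, round(connector[asn], 4), round(betweenness[asn], 4))

Sizing nodes by these scores in a drawing tool reproduces the “bigger node, more paths flow through it” effect the excerpt describes.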

We are trying to solve a societal problem by throwing technology at the problem.  We seem to do that with many problems these days.  Yet, technology can help us make sense of complex dynamics… if mixed with the social sciences.  The map above would have been really useful during the early months of 2001.  The network layout algorithms in the software allow us to see emergent structures, including key nodes and clusters, in this human network.  Once we have the map, we can measure the network to find which nodes/links keep the network together.

Would the intelligence community have taken this map seriously in 2001?  Would other agencies have ignored the map because “it was not invented here”?  Good data and good analysis are not often utilized correctly when they move from one organization/context to another.  Would the ABC agency know what to do with the analysis from the XYZ agency?

Big data implies that with more data, you get certainty.  Instead of certainty you might get the opposite, because “the more” might actually include noise or dirt.  Noisy/dirty data with the appearance of certainty/accuracy is the worst-case scenario.  Good analysis requires good data, but big data alone is not sufficient for complex analysis of events that have not yet happened.  We need to be careful what and who get caught in our nets of surveillance.

This data was collected after the 9-11 attacks, and is a reasonable depiction of the AQ network in the USA before September 11, 2001.  Would this network map have stopped the 9-11 attacks?  

T N T — The Network Thinkers: Connecting the Calls.
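The second excerpt’s point about which nodes and links keep a network together can also be illustrated with a small sketch: find the cut points of a graph, then see how badly it fragments when the most central node is removed. The graph below is invented purely for illustration; it is not the call data behind the post.

import networkx as nx

# Toy contact graph, invented for illustration only.
G = nx.Graph()
G.add_edges_from([
    ("A", "B"), ("A", "C"), ("A", "D"),   # A acts as the hub
    ("B", "E"), ("C", "F"), ("D", "G"),
])

# Articulation points: nodes whose removal disconnects the graph.
print("Cut points:", list(nx.articulation_points(G)))

# Remove the highest-betweenness node and count the fragments left behind.
bc = nx.betweenness_centrality(G)
central = max(bc, key=bc.get)
H = G.copy()
H.remove_node(central)
print(f"Removing {central} leaves {nx.number_connected_components(H)} pieces")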

 
