Did IBM use “Face Capture” to Surveil Boston Calling? What is “Face Capture”?

City of Boston Public Event Mgmt Demo handout, prepared by IBM for the City of Boston in anticipation of live implementation in May of 2013.

Much of what troubled readers with whom I spoke about our disclosures at DigBoston was the so-called “face capture” and “face tracking” software employed on live video surveillance of Boston Calling attendees at both of last year’s music festivals.

In emailed comments published by the Boston Globe days before the most recent festival, IBM addressed reports of the technology’s use.

“City of Boston Public Event Mgmt Demo handout,” prepared by IBM for the City of Boston in anticipation of live testing in May of 2013.

Writes the Globe:

IBM did not return calls seeking comment about the demonstration, but in an e-mail, company spokeswoman Holli Haswell said neither “face capture . . . nor facial recognition” were used at the event.

Out of consideration for the privacy of the thousands of individuals whose images were captured and retained for more than a year, flapping in the digital breeze, as it were, and additionally out of a desire to respect substantive security concerns, we eschewed publication of the original files. We did, however, provide exhaustive sourcing, and we take back not a single line of our original work. Dan McCarthy, Editor of DigBoston, had this to say about the significance of the “Boston Trolling” series:

“The discovery of this story, and its ensuing publication in the print and online pages of DigBoston, has been a watershed mark for journalism, but not just within Greater Boston. In the weeks that followed the initial chapter, we have been approached by print publications and online news outlets, both national and international. And there’s good reason for that. The microcosm this story represents seems to have tapped into a larger encroaching sense of diminishing privacy in the new socio-cultural frontier.”

There are in fact multiple references to “face capture,” both in the preparation for and in the “Post-Mortem [sic]” of the Smart Surveillance Solution pilot at Boston Calling (note the timestamps in the image above).

In preparation for the first festival, at least three cameras were configured with “face capture” as a Use Case.

From IBM spreadsheet “CoB Analytic Cameras.”

“Face Capture” in video surveillance is not a type of software, but a necessary element in multiple analytic profiles. It refers to exactly what it literally describes: the photographic capture of a person’s face. IBM’s denial is particularly hollow, in that it can be refuted not only directly by the company’s own documentation, but also by a simple observation of the visual media that its employee stored unsecured on his private server.

“Face Capture” can be done on any video in which a subject’s face appears in full, or at a predetermined percentage, in at least one frame – software can be configured to do this automatically according to rules determined by the programmer and operator, such as in the performance of a “Near-field People Search.” Some software, including IBM’s, can “learn” what sort of searches it should perform and deliver automated alerts when it gets a “hit.”

IBM and the City of Boston most certainly performed “face capture” at Boston Calling – whether or not they used facial recognition technology remains to be settled. Both have denied this as well, even though the same IBM employee’s documentation makes note of integrating a “third-party facial recognition engine.” Boston Police denied their involvement in the program entirely; shortly afterward, pictures of BPD officers observing the software in live use were published by Vice’s Noisey. I’m publishing all of the “Command Center” photos with BPD in them below, for dramatic effect.

DigBoston’s McCarthy also noted the attitude of acceptance of privacy encroachments in the wake of the Marathon bombings, and how revelations like those published in Dig are enabling a “sudden shift to one of justifiable skepticism. It may be one of the last weapons the Fourth Estate has in these increasingly uncertain and nervous times.”

[Command Center photos with BPD officers, 05/25/2013]

Research Hacking – Searching for Sensitive Documents on FTP; Captchas and the Google Governor

If you want to find *sensitive* documents using Google search (documents containing information that someone does not want revealed, more or less), I’ve found that in addition to targeting queries at specific domains and file types, an alternative and potent approach is to restrict your results to files residing on an FTP server.

The rationale is that while many FTP servers allow anonymous log-in, and even more are indexed by Google, they are used more for uploading, downloading, and storing files than for serving pages, and they typically house more office-type documents (as well as software). Limiting your searches to FTP servers also significantly restricts the overall number of results returned, so choice keywords combined with a query that tells Google to bring back files that have “ftp://” but NOT “http://” or “https://” in the URL yield a high density of relevant results. This search type is easily executed:

[Screenshot: FTP-restricted Google query, 12/03/2013]
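To make the shape of that query concrete, here is a minimal sketch; the keyword string is purely illustrative, and the operators simply encode the ftp://-but-not-http(s):// restriction described above. It assembles the dork and hands it to Google in your default browser.

```python
import webbrowser
from urllib.parse import quote_plus

# Illustrative keywords -- substitute your own terms of interest.
keywords = '"emergency operations plan" filetype:pdf'

# Require "ftp://" in the URL and exclude anything served over HTTP/HTTPS.
dork = keywords + ' inurl:"ftp://" -inurl:"http://" -inurl:"https://"'

# Open the query in the default browser via Google's normal search endpoint.
webbrowser.open("https://www.google.com/search?q=" + quote_plus(dork))
```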

A caveat one encounters before long with this method is that eventually Google will present you with a “captcha.” Many, many websites use captchas, and pretty much everyone who uses the internet has encountered one. The basic idea behind a captcha is to prevent people from using programs to send automated requests to a web server; captchas are a main tool in fighting spam, thwarting bots that mine the internet for email addresses and other data, or that register for online accounts and other services en masse. The captcha presents the user with a natural-language problem to which they must provide an answer.

Google is also continuously updating its code to make it difficult to exploit Google “dorks,” queries using advanced operators similar to the one used above (but usually more technical and specific). Dorks are mostly geared toward penetration testers looking for web application and other vulnerabilities, but the cracker’s tools can easily be adapted for open source research.

[Screenshot: Google captcha prompt, 12/03/2013]

Unless you are in fact a machine (sometimes you’re a machine, in which case there are solutions), this should be easily solved; lately, however, instead of returning me to my search after I answer the captcha, Google has been sending me back to the first results page of my query (forcing me to more or less start the browsing process over and to encounter another captcha). I’m calling it the Google Governor, as it seems to throttle searchers’ ability to employ high-powered queries.

The good news is that the workaround is really just smart searching. One thing you’ll notice when browsing your results is that dozens of files from the same, irrelevant site will be presented. Eliminate these by adding -inurl:"websitenameistupid.com" (which tells Google NOT to return results with "websitenameistupid.com" in the URL). Further restrict your results by omitting sites in foreign domains (especially useful with acronym-based keyword searches): -site:cz -site:nk.
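As a sketch of how those refinements stack onto the base query (the keywords are placeholders, the noisy host is the dummy name from above, and the country codes are the ones used in this post – adjust all of them to whatever your own results turn up):

```python
from urllib.parse import quote_plus

# Base FTP-restricted dork; keywords again purely illustrative.
base = '"your keywords here" inurl:"ftp://" -inurl:"http://" -inurl:"https://"'

noisy_hosts = ["websitenameistupid.com"]  # recurring, irrelevant sites
foreign_tlds = ["cz", "nk"]               # country-code domains to drop

query = base
query += "".join(' -inurl:"{}"'.format(host) for host in noisy_hosts)
query += "".join(" -site:{}".format(tld) for tld in foreign_tlds)

print("https://www.google.com/search?q=" + quote_plus(query))
```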

When you find an FTP site that looks interesting, copy and paste the URL into a client like FileZilla for easier browsing.
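If you would rather stay on the command line, the same anonymous browsing can be done with Python’s standard ftplib; the hostname below is a placeholder for whatever server your search surfaced.

```python
from ftplib import FTP

HOST = "ftp.example.com"   # placeholder -- use the server from your results

ftp = FTP(HOST, timeout=30)
ftp.login()                # anonymous log-in, which many indexed servers allow
ftp.cwd("/")               # start at the root directory
for entry in ftp.nlst():   # short listing; ftp.dir() prints a long-format one
    print(entry)
ftp.quit()
```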

To give you an idea of the sensitivity of documents that can be found: one folder, titled “[Name] PW and Signature,” contained dozens of files with passwords as well as .crt, .pem, and .key files; another, titled “admin10,” contained the file “passwords.xls.” This was the site of a Department of Defense and Department of Homeland Security contractor – the document contains the log-in credentials for bank accounts, utilities, and government portals. That particular document is of more interest to the penetration tester; for our purposes it serves as a gauge of the sensitivity of the gigabytes of files that accompanied it on the server. The recklessness of the uploader exposed internal details of dozens of corporations and their business with government agencies.

The hopefully sufficiently blurred “passwords.xls”

*As of this writing, the FTP server mentioned above is no longer accessible.

Blackwater GSA Schedule 84 Security Services Pricing Catalog

[Screenshot: Blackwater GSA price list, 12/07/2013]

Found among the documents I’ve been posting here and those published on the Declaration, a nice reference: the Blackwater GSA Price List (General Services Administration Schedule GS-07F-0149K), 30 August 2006. You’ll remember Blackwater, or you can use your Googler.

It’s an interesting read – Tier 1 is a nice place to be, at over $1,000 a day.

 

[Screenshot: Blackwater GSA price list, 12/07/2013]

Read here

The Department of Defense Information Operations Condition (INFOCON) Decision Matrix

[Screenshot: INFOCON Decision Matrix, 11/21/2013]

Employing meta-search methods for online research about which I have been tweeting and writing, I found myself in possession of a copy of the Department of Defense Information Operations Condition, or INFOCON, Decision Matrix. “INFOCON” is a threat condition like DEFCON, with numbered tiers, based on an intelligence assessment of active malware and its likelihood of disrupting connectivity/functionality.

There is much more where this came from. – K

Recommended Read: AI and the Future of Free Journalism

1. Journalism’s first obligation is to the truth;
2. Its first loyalty is to the citizens;
3. Its essence is a discipline of verification;
4. Its practitioners must maintain independence from those they cover;
5. It must serve as an independent monitor of power;
6. It must provide a forum for public criticism and compromise;
7. It must strive to make the significant interesting and relevant;
8. It must keep the news comprehensive and proportional;
9. Its practitioners must be allowed to practice their personal conscience.

 “Principles of Journalism,” Pew Research Center’s Project for Excellence in Journalism (PEJ) and the Committee of Concerned Journalists

I’m reading this now; it contains important considerations for both reporters and consumers of news. From the abstract of “AI and the Future of Free Journalism”:

Interaction between journalism, the Internet and social communities is familiar and intensely discussed, helping us understand how journalism can raise our collective intelligence. We discuss how artificial intelligence (AI) will add to that picture and thus influence the future of journalism. We describe ‘Digital Identities’ and their future interaction with journalism. We summarize state-of-the-art AI methods usable to establish the ‘DNA’ of journalistic content, how matching that content with digital identities enables behavioral targeting for consumer engagement. We review the driving forces such procedures may introduce to journalism and show an example of a journalistic behavioral-targeting engine. We highlight some concerns and discuss how using digital identities and AI can be complex versus current journalistic principles. We stress the need for ethical principles in using digital identities in journalism, and suggest examples of such principles. We issue a call for stakeholders to jointly explore the potential effects of AI algorithms on the journalism profession and journalism’s role in a democratic society and suggest questions to be explored.

Forensic Indexing, Metadata, and the DVIC Privacy Policy

When doing research on a subject that has some measure of obscurity by design, such as the fusion center in Philadelphia, the Delaware Valley Intelligence Center (DVIC), I often find the only way to fill in the gaps is to “data-mine” for documents. I use quotes because data-mining strictly involves aggregating and analyzing more fragmented bits of *data*, whereas I deal more in *information*, and data-mining usually implies a much more intensive level of computation applied to a much larger corpus than I will discuss here.

You can get hands-on with data mining. This is Tree-Map; I use a program called BaseX. They’re similar, and great for browsing structured data like XML.

A more appropriate term would be “forensic indexing,” in that I am applying basic methods of digital forensics, like metadata extraction, to a general knowledge-management system for a collection of documents too large to realistically open one by one. And I’ve just made it sound more organized than it usually is.

In the case of the DVIC, this meant using an application that automates queries to metasearch engines and also enumerates a specified domain to find relationships and other information. I used FOCA. I saved the documents returned by each search in separate folders according to the domain I had chosen, collecting around 1,800 documents.
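Before grepping for content, a first indexing pass over a haul like that can simply dump each file’s embedded metadata. A minimal sketch, assuming the pypdf library; the folder name is a placeholder for wherever you saved the downloads.

```python
from pathlib import Path
from pypdf import PdfReader

corpus = Path("collected_documents")   # placeholder folder of downloaded PDFs

for pdf_path in sorted(corpus.glob("*.pdf")):
    try:
        info = PdfReader(pdf_path).metadata   # embedded document information
    except Exception as err:                  # skip damaged or odd downloads
        print(f"{pdf_path.name}: unreadable ({err})")
        continue
    if info:
        # Typical fields: /Title, /Author, /Creator, /Producer, /CreationDate
        print(pdf_path.name, dict(info))
```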

I then ran a simple command-line program called pdfgrep, using the command pdfgrep -n -i "dvic" *.pdf to list every line containing the phrase “dvic” in every PDF file in the directory, tagged with file name and page number and ignoring case. One such query returned:

[filename]pg#: “text”

As you might imagine if you have followed the Declaration’s coverage, I was a bit confused. I went to the corresponding folder on my desktop and opened the file in my reader:

[Screenshot of the opened document, 11/06/2013]

This document is titled “Nebraska Information Analysis Center,” another fusion center which, it just so happens, is missing a document from the fusion center association website. Where metadata comes in – and why I had missed this by manually “googling” until now – is in how FOCA searches for documents: by file name, which appears in the document’s metadata and gives its file path, its URI, on the machine that stores it. You can sometimes do something similar by typing inurl:[term] into Google, but then you would have to know the exact name of the file to get relevant results. The name of this file is “Delaware-Valley-Intelligence-Center-Privacy-PolicyMar-2013.” It would have been very difficult to come up with that by educated accident.

[Screenshot, 11/06/2013]

So while there are still serious questions about the date gap between beginning a “cell” and submitting a policy, and concerns about the lack of a full-time privacy officer, among others, it seems that everyone who was sure a policy had been completed and approved by DHS was quite correct, and I’d like to thank them for adding accurate memory to the time they graciously gave to discuss the subject. It seems that, somewhere in its life, a March draft was labeled as the Nebraska Information Analysis Center’s policy – perhaps at the National Fusion Center Association website, where the “comprehensive” list is found, by whoever neglected to link it to the analysis center’s website.

This is only one elucidation among many from recent developments, the fruits of fresh approaches, and, as mentioned, more documents to parse. Read the Declaration.

Big Data Used Against the Poor for More Precise Discrimination, “a form of data redlining.”

Data analytics, “Big Data,” is being implemented more widely in more fields of public life and commerce. As data-mining tools are honed and expanded, the information they provide can be put to dubious purposes. In the MIT Tech Review:

Data analytics are being used to implement a subtle form of discrimination, while anonymous data sets can be mined to reveal health data and other private information, a Microsoft researcher warned this morning at MIT Technology Review’s EmTech conference.

Kate Crawford, principal researcher at Microsoft Research, argued that these problems could be addressed with new legal approaches to the use of personal data.

In a new paper, she and a colleague propose a system of “due process” that would give people more legal rights to understand how data analytics are used in determinations made against them, such as denial of health insurance or a job. “It’s the very start of a conversation about how to do this better,” Crawford, who is also a visiting professor at the MIT Center for Civic Media, said in an interview before the event. “People think ‘big data’ avoids the problem of discrimination, because you are dealing with big data sets, but in fact big data is being used for more and more precise forms of discrimination—a form of data redlining.”

Kate Crawford speaking today at the EmTech conference at MIT. Photo MIT Technology Review

An Ethnography of the National Counter-Terrorism Center

A sociology grad student at UPenn has published a dissertation analyzing the internal culture of the Intelligence Community. Ms. Nolan spent a year in the Radicalization and Extremist Messages Group of the National Counterterrorism Center at Liberty Crossing, VA. Her dissertation explored the daily life of the counterterror analyst, the wider status inequities in the Intelligence Community, office humor at LX (as Liberty Crossing is abbreviated), and the prime output of intel analysts: written reports for politicians. The full title:


INFORMATION SHARING AND COLLABORATION IN THE UNITED STATES
INTELLIGENCE COMMUNITY: AN ETHNOGRAPHIC STUDY OF THE NATIONAL
COUNTERTERRORISM CENTER
Bridget Rose Nolan
A DISSERTATION
in
Sociology
Presented to the Faculties of the University of Pennsylvania
in
Partial Fulfillment of the Requirements for the
Degree of Doctor of Philosophy
2013

The publication has already been written about widely in the press, but I have not seen it subjected to critical review. I wanted to share the paper as I begin my own review – it is a relatively long document at some 215 pages, uploaded here [Nolan_Dissertation].

An initial impression: while I’m sure it must be a mistake, as the paper’s author can very well be expected to know the basic history and structure of the post-9/11 Intelligence Community, the dissertation begins with the 9/11 Report and the defects it found in the US spy infrastructure.

From the text, “Background,” pg. 3:

The 9/11 Commission Report, formally known as the Final Report of the National Commission on Terrorist Attacks Upon the United States, is the official record and best-known analysis of the attacks. Of its many conclusions, a main one is that a primary cause of the attacks was that the 16 intelligence agencies had failed to share information in a proper and timely manner. These 16 agencies are listed below:

Central Intelligence Agency (CIA)
Federal Bureau of Investigation (FBI)
Defense Intelligence Agency (DIA, part of the Department of Defense [DOD])
National Security Agency (NSA)
Department of Homeland Security (DHS)
National Geospatial-Intelligence Agency (NGA)
Department of the Treasury
Drug Enforcement Administration (DEA)
Department of Energy
National Reconnaissance Office (NRO)
Department of State (INR [Bureau of Intelligence and Research])
United States Army Intelligence
United States Navy Intelligence
United States Air Force Intelligence
United States Marine Corps Intelligence
United States Coast Guard Intelligence

Before September 11th, each of these intelligence agencies functioned more or less independently. It was presumed agencies would share information when necessary to achieve the relevant mission and that interagency collaboration would occur as a natural outcome of any task in which more than one agency was or should be involved. In practice, however, isolation and even open hostility has characterized the relationships among intelligence agencies, with the CIA-FBI and CIA-NCTC relationships being two of the most contentious.

The Department of Homeland Security was not among the agencies subjected to that scrutiny; it was formed after the attacks (absorbing many other agencies, including cabinet-level entities and the USCG Intelligence listed above), as was the new Office of the Director of National Intelligence, which was created as part of the Commission’s recommended reforms.

Some choice excerpts, anecdotes from the section examining analyst humor:

The first example of Agency folklore I will discuss is the Hot Dog TDY story. This is a true story that occurred during my second summer as a Graduate Fellow at the Agency, so I was able to track down the actual story and reproduce it below. A real TDY (Tour of Duty) is essentially a business trip in the IC, and whenever an employee is traveling somewhere, a cable is written to document the trip. The Hot Dog TDY is a satirical cable someone wrote upon hearing a rumor that there is a hot dog vending machine in the basement of the Old Headquarters Building (OHB) at CIA. I have already described the oblique language of cables and the difficulty people have in deciphering them; cables about domestic TDYs are considerably easier to read, but still follow a certain pattern, rhythm, style (all capital letters, for instance), and, of course, acronyms (e.g. NFI, which means “no further information” or “not further identified”). Someone adapted the language of cables to describe this hot dog vending machine, and other analysts found it so funny that it still was making the email rounds years after it was written. Here is the “cable” in its entirety, from my field notes:

OBSERVATIONS OF OHB HOT DOG MACHINE
1. LOCATION: OHB APPROX. GF45 NEAR THE ELEVATORS AND GREEN
JACKET HIVE
2. APPEARANCE: STANDARD VENDING MACHINE APPEARANCE WITH
THE WORDS OSCAR MAYER AND A LARGE WIENER FEATURED ON
FRONT.
A. ALSO INCLUDES OBSERVATION WINDOW
3. FEATURES: CHOICE OF 3 WIENERS, STANDARD BUN
A. OSCAR MAYER WIENER
B. SOME GERMAN THING
C. PREMIUM WIENER WITH CHEESE CORE
4. COST:
A. OSCAR MAYER $2.00
B. THE GERMAN $2.50
C. PREMIUM CHEESE CORE $3.00
5. EXTRAS: KETCHUP, MUSTARD PACKETS FOR MANUAL ASSEMBLY. NO
RELISH
6. OPERATION: THE HOTDOG VENDING MACHINE APPEARED TO
OPERATE CORRECTLY SERVING A WARM WIENER ASSEMBLED WITH A
MOIST, SLIGHTLY HEATED BUN. MULTIPLE ROBOTIC MECHANISMS
WERE OBSERVED THROUGH THE OBSERVATION WINDOW. ROBOT 1
CAPTURED SELECTED WIENER (OSCAR MAYER) FROM THE WIENER BAY
AND POSITIONED WIENER IN FRONT OF ROBOT 2. ROBOT 2 WAS
LABELED WITH MANY HAZARD INDICATORS SUGGESTING IT TO HAVE A
HEATING FUNCTION (FIELD COMMENT: HEATING COULD BE POWERED
BY LASERS). ROBOT 2 THEN OPENED ITS PORT TO RECEIVE THE
WIENER. ROBOT 1 THEN INSERTED THE WIENER INTO ROBOT 2 FOR
HEATING. ROBOT 1 THEN PROCEEDED TO AGITATE THE WIENER IN AND
OUT OF ROBOT 2 UNTIL DONE—PRESUMABLY WHEN WIENER HAD
PLUMPED. ROBOT 1 THEN ASSEMBLED WIENER WITH A WARM BUN
(NFI) APPEARING TO REST IN A ROBOT 3. ROBOT 3 THEN DELIVERED
ASSEMBLED HOTDOG THROUGH THE SERVING PORT.
7. TASTE: WIENER HAD INDEED PLUMPED—CHARACTERISTIC SPLITTING
OF WIENER WAS OBSERVED. WIENER WAS THOROUGHLY COOKED AND
WARM TO THE CENTER. BUN REMAINED COOLER AND HAD NOT
BECOME SOGGY. QUALITY EQUALED THAT OF AVERAGE BALL PARK.

And:

[Screenshot: additional excerpt from the dissertation, 2013-10-08]

The photos below were posted to Cryptome; they show the NCTC/DNI campus in McLean, VA.

[Photos of the NCTC/DNI campus, via Cryptome]

On the stuff terrorism is made of

After my visit to the Fusion Center with my friend Mike (and being asked if we were terrorists while taking pictures of the honest-to-god police mannequin in the Oregon Avenue guard booth), and after talking to Jo, I was thinking about how incendiary my posts yesterday regarding a police officer possibly hurting our Miner might seem in this climate, where arrests for ‘threatening’ Facebook posts regularly occur. The post:

If a police officer were to shoot my dog, they’d have to add at the least a severely tased if not bullet-ridden 6’5″ man to the paperwork. There’s almost never a reason for this, barring a rabid Clifford.

I’m not taking anything back. I don’t own weapons and will not obtain one to protect my dog; I’m pointing out the common-sense notion that I will be angry, and that real consequences, out of anyone’s control, should be expected when an officer of the law needlessly and willfully kills a person’s loved one.

So I guess what I’m saying is, please don’t raid my house and shoot my dog for saying I’d be infuriated if you shot my dog. Thanks.

Fort Meade cops pull over Wikileaks Truck for disseminating “Top Secret WikiLeaks Info”

Cool Revolution

Who knew that the WikiLeaks Truck was chock full of Top Secret Info? Just like it says in bold lettering on the side!

Clark Stoeckley was pulled over this evening while driving the WikiLeaks Truck–no connection to media group WikiLeaks–after a long day at Fort Meade covering whistleblower Bradley Manning’s trial. Stoeckley is a cartoonist who has published a book about Manning’s pre-trial hearings.

His tweets recounting his chat with the Fort Meade officers speak for themselves. It was Stoeckley who got pulled over, but the punchline is how someone pulled one over on these two cops for a laugh.


Network Thinkers on Big Data Spying

While combing the Internet for research links the other night, specifically for text-mining and analysis software, I discovered Orgnet’s blog. Orgnet provides tech and consulting services for a variety of organizations, and its people are long-time specialists in network analysis.

InFlow is their flagship product.

[InFlow screenshots]

The Network Thinkers blog featured two posts relating to the incendiary press reports of NSA surveillance, especially data collection. I wanted to share them – excerpts below, followed by links to the full posts.

Network thinkers know that to effectively monitor a network, you don’t seek out the edge nodes, you find the central hubs and monitor them — through them you will have access to most of what is flowing through the net. Below is a network map of the Autonomous Systems [AS] that form the backbone of the internet. It is easy to find the central hubs in this network. Load the 20,000+ nodes [each AS is represented by a node] and 48,000+ links [a data flow between two ASes is represented by a link] into a social network analysis software program and have it run the Betweenness or Connector metric. These two network metrics reveal how central any node is in keeping everything interconnected. The hubs will be revealed by the network metrics. In the diagram below the hubs are sized by their Connector score — the higher the score, the larger the node, and the more network paths flow through this node. The colors are randomly assigned and have no meaning.

T N T — The Network Thinkers: Vacuuming the Internet.
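To make the hub-finding step in the excerpt above concrete, here is a small sketch using the networkx library. The toy edge list stands in for the AS-level backbone data the post describes (which I do not have); the metric is the same betweenness calculation.

```python
import networkx as nx

# Toy stand-in for the AS backbone map: nodes are autonomous systems,
# edges are observed data flows between them.
edges = [
    ("AS1", "AS2"), ("AS1", "AS3"), ("AS1", "AS4"),
    ("AS2", "AS5"), ("AS3", "AS5"), ("AS4", "AS6"),
    ("AS5", "AS6"), ("AS6", "AS7"),
]
G = nx.Graph(edges)

# Betweenness centrality: the share of shortest paths passing through a node.
# High scores mark the hubs through which most of the traffic would flow.
scores = nx.betweenness_centrality(G)
for node, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{node}: {score:.3f}")
```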

We are trying to solve a societal problem by throwing technology at the problem.  We seem to do that with many problems these days.  Yet, technology can help us make sense of complex dynamics… if mixed with the social sciences.  The map above would have been real useful during the early months of 2001.  The network layout algorithms in the software allows us to see emergent structures, including key nodes and clusters, in this human network.  Once we have the map, we can measure the network, to find which nodes/links keep the network together.

Would the intelligence community have taken this map seriously in 2001?  Would other agencies have ignored the map because “it was not invented here”?  Good data and good analysis is not often utilized correctly when it moves from one organization/context to another.  Would the ABC agency know what to do with the analysis from the XYZ agency?

Big data implies with more, you get certainty.  Instead of certainty you might get the opposite because “the more” might actually include noise or dirt.  Noisy/dirty data, with the appearance of certainty/accuracy is the worst case scenario.  Good analysis requires good data, but big data alone is not sufficient for complex analysis of events that have not yet happened.  We need to be careful what and whom gets caught in our nets of surveillance.

This data was collected after the 9-11 attacks, and is a reasonable depiction of the AQ network in the USA before September 11, 2001.  Would this network map have stopped the 9-11 attacks?  

T N T — The Network Thinkers: Connecting the Calls.

 

Repost from @sabzbrach, needful thoughts on Intelligence, Information, and Power

From Joanne Michele’s tumblr

I wrote this weeks ago and let it go because I couldn’t be bothered to find citations or write an ending. However, I find it relevant again, so here it is, in raw, unedited glory. I have no idea where I thought I was going with it and am aware that publishing it now only adds to an existing conversation that my “I said this first!1!” assertions cannot hope to influence. Enjoy.

Some time ago, Alexa O’Brien tweeted that there is a larger story behind the trial of Pfc. Bradley Manning, and it does not concern his sexuality or gender identity.

In the days since she wrote that, I’ve been reflecting on both my own framing of this case and the stories that others have chosen to tell. For some, Manning’s gender identity or sexual preference may be the key to their own involvement, and I respect that. For others it is his military service, or his connection to Wikileaks that is important to them, and that is fine as well. I commend whatever reasons anyone has for following this case, because it is vital that we bear witness.

A few weeks ago I found myself – by extension, not name – in a cable, as part of the cache that Manning is alleged to have leaked. I have, or thought I had, at least perused all of the cables in the three years since their initial release. At first my interest was only about Iran and issues regarding the post-election violence there or the State Department’s plans for the thousands of refugees who fled the subsequent crackdown on journalists, activists and intellectuals – the best and brightest, essentially. The sphere I operated in at the time of the Cablegate release was determined to ignore their impact, either because it threatened their work as knowledge brokers (see Stratfor or almost any geopolitical analyst) or because of some ideological drive to downplay what their release meant. “It’s nothing we don’t already know,” was the Cablegate-era’s “Don’t focus on the Boston bombing while hundreds die in Iraq every day.” I can at least say that my reaction then was that at least our work was validated – even if we already knew, at least the atrocities were in black and white.

I’ve since gone over the cables dozens of times, looking for references to the MEK or for a laugh at Berlusconi and Putin’s “bromance,” or for a hundred other things. They have been credited with helping to spark the Arab Spring, with embarrassing the U.S., as justification for politically-motivated charges against Julian Assange, and have been endlessly helpful to journalists, analysts, and people just trying to understand.

While in college I worked at a retail shipping franchise. One of my favorite customers moved to China for a year and planned to ship a myriad of belongings, including their son’s high school advanced placement textbooks. International shipping is fun, to me, and I enjoyed getting the 12 or so huge boxes together with their Customs documents and generally using it as an excuse to focus on one thing rather than the petty annoyances that come with working retail. A few weeks after they left, we shipped these boxes. To make it short, although there were 12 boxes, it was considered for Customs purposes to be one shipment. They shared the same paperwork and moved together, and if there was a problem with one box, it was a problem for the entire thing. Well, of course there was a problem, and this is that China would not accept these books. It wasn’t even necessarily the content (which was all over the place – Math, Science, the Italian Renaissance, I think), but the quantity. China restricts shipments of books over a certain number, to keep out subversive literature and Christian missionaries. Everything had to come back, and I learned then the power of information. Countries such as Italy restrict the import or export of clothing, because textiles are a foundation of their economy. Mexico puts restrictions on photographs in an effort to keep child pornography out of the country. Customers could ship all kinds of things to their enlisted family members with APO addresses in the Middle East, but footballs were forbidden to civilian addresses because they are, literally pig-skin. But China….China restricted books, and I fought for months to keep them from destroying the clothing and family possessions along with these dangerous books.

Whoever has possession of information has incredible power – this is how Fred Burton of Stratfor is able to make an income.

Contractors have access to classified secrets, and not always because they need them to do their jobs, but because of the privilege offered by their positions as contractors for the government. Money, or the need to make it, offers access to information that the government has deemed secret and necessary to protect for national security and other reasons often cited by opponents of Manning.

Part of what Bradley Manning exposed is that classification and the stranglehold on classified information is largely petty and superficial.

An unprecedented number of individuals have top-secret clearance; these are not CIA agents, military commanders or drone pilots, but private contractors, who are paid handsomely for outsourced government contracts and granted access to not only the documents but the mechanisms for secrecy and hierarchy of information access.

Even the Department of Homeland Security has recently acknowledged, at least internally, that over-classification is prohibitive and is working to assign lower-level (FO/UO) categories to de-prioritize what is usually mundane information.

What Bradley Manning did was liberate information, and that is what was so dangerous to those who hold the keys to it. He will likely be in military prison for the rest of his life. Aaron Swartz took his own life earlier this year after being the target of relentless prosecution by federal prosecutor Carmen Ortiz, who is notorious for her overreach as well as her political aspirations. Barrett Brown, a journalist often associated with Anonymous, is facing over 100 years in prison, in part because he is alleged to have shared not information itself (hundreds of credit card numbers obtained during a hack of Stratfor, which was itself done under the watchful eye of the FBI through an informant), but a link to a chatroom that itself had a link to this information. Brokering information, whether it be scientific research, State Department cables, credit card numbers or access to them, is profitable for the likes of Fred Burton or Aaron Barr but seemingly deemed criminal when given up for free.

Prosecuting these men and classifying information dissemination as criminal has the effect of deterring whistle-blowers and intimidating journalists. Even if the charges against Brown are dropped or Manning is somehow freed before his death, the message sent is of zero tolerance. Dissidents and journalists are not protected by mainstream media empires, but even the largest ones kowtow to the administration on issues of national security and terrorism. When the Wall Street Journal was presented with (uncovered?) evidence of U.S. drone bases in Saudi Arabia, the publication sat on the story at the behest of the Obama administration until it was deemed appropriate to publish. Reuters had repeatedly requested the now-infamous Collateral Murder video but were denied again and again, only finally seeing the footage when it was published by Wikileaks in 2010.

On Barrett Brown, Charles Swift said: “There is an established tradition where journalists can publish information, classified or not.”

If contractors are operating like government agencies, we need to establish the same rights of access and publication for their information as journalists have traditionally been granted for government agencies.

A Note on Occupy Homecoming

This might come off as more critical than it really is – I think it is important for those who feel dedicated to a particular cause to maintain momentum through exercises in their cause’s modalities. I’m not discouraging anyone from attending the Occupy Homecoming June 1 in Manhattan, although I myself cannot make it. But I would like to raise a concern about this posting, one that makes an enthusiastic recommendation difficult: for all practical purposes, I have no idea what

“Sleep Cell Convergence! Liberty Plaza is the home of Occupy Wall Street, but we express our power by direct action, and sleep on the streets as a political action to expose the corrosion that is corrupting our world. Our targets may consistently change, but we can always be found here first as we choose our actions of resistance on a nightly basis!”

means. I vaguely recognize its rhetorical ancestors, and it certainly shares the patent language of Adbusters. I am interested in the event’s politics and of course feel a stake in the Occupy ‘brand’ (if you’ll pardon the phrase). However, in order to extract real-world meaning, I believe language like ‘we express our power by direct action, and sleep on the streets as a political action to expose the corrosion that is corrupting our world’ must withstand quite a bit of decompression – in fact, thorough explication – to which it has been and continues to be subjected, the product of which you can find in *books*.

However, I am very skeptical that anyone can really tell me what the above means, or that anyone really believes someone unfamiliar with the event would have any idea what to expect from it, or could visualize its practical target. It seems to me that this disconnect may be endemic, and symptomatic of a sort of ideological and strategic apoptosis.