However, the pre-crime concept is coming very soon to the world of Human Resources (HR) and employee management.
A Santa Barbara, Calif., startup called Social Intelligence data-mines the social networks to help companies decide if they really want to hire you.
While background checks, which mainly look for a criminal record, and even credit checks have become more common, Social Intelligence is the first company that I'm aware of that systematically trolls social networks for evidence of bad character.
Using automation software that slogs through Facebook, Twitter, Flickr, YouTube, LinkedIn, blogs, and "thousands of other sources," the company develops a report on the "real you" -- not the carefully crafted you in your resume. The service is called Social Intelligence Hiring. The company promises a 48-hour turnaround.
Because it's illegal to consider race, religion, age, sexual orientation and other factors, the company doesn't include that information in its reports. Humans review the reports to eliminate false positives. And the company uses only publicly shared data -- it doesn't "friend" targets to get private posts, for example.
The reports feature a visual snapshot of what kind of person you are, evaluating you in categories like "Poor Judgment," "Gangs," "Drugs and Drug Lingo" and "Demonstrating Potentially Violent Behavior." The company mines for rich nuggets of raw sewage in the form of racy photos, unguarded commentary about drugs and alcohol and much more.
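The company hasn't published how its flagging actually works, but a minimal sketch of the kind of category-based screening described above might look like this. The category names come from the article; the keyword lists, scoring and function names are invented for illustration.

```python
# Hypothetical keyword-based screen: bucket a candidate's public posts
# into report categories. Real systems would be far more sophisticated
# (and, per the article, human-reviewed to cut false positives).

CATEGORIES = {
    "Poor Judgment": {"fired", "hate my boss", "hungover"},
    "Drugs and Drug Lingo": {"weed", "blazed", "420"},
    "Demonstrating Potentially Violent Behavior": {"fight", "weapon"},
}

def flag_post(text):
    """Return the report categories a single public post triggers."""
    lowered = text.lower()
    return sorted(
        category
        for category, keywords in CATEGORIES.items()
        if any(keyword in lowered for keyword in keywords)
    )

def build_report(posts):
    """Aggregate flagged categories across all of a candidate's posts."""
    report = {}
    for post in posts:
        for category in flag_post(post):
            report.setdefault(category, []).append(post)
    return report
```

A real service would crawl the "thousands of sources" itself; here the posts are simply passed in as strings, which is the smallest shape that shows the categorize-and-aggregate step.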
The company also offers a separate Social Intelligence Monitoring service to watch the personal activity of existing employees on an ongoing basis. The service is advertised as a way to enforce company social media policies, but because the criteria are company-defined, there's little to stop it from monitoring personal activity as well.
The service provides real-time notification alerts, so presumably the moment your old college buddy tags an old photo of you naked, drunk and armed on Facebook, the boss gets a text message with a link.
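The alerting flow described above reduces to a familiar pattern: poll for new public posts, diff against what's already been seen, and notify on anything newly flagged. This is a hypothetical sketch of just that loop -- `fetch_public_posts`, `is_flagged` and `send_alert` are stand-ins supplied by the caller, not anything the real service documents.

```python
# One polling pass of a hypothetical real-time monitor: alert on posts
# that are both new (not in `seen`) and flagged by the given predicate.

def monitor(fetch_public_posts, is_flagged, send_alert, seen=None):
    """Poll once; return the updated set of already-seen post IDs."""
    seen = set() if seen is None else seen
    for post_id, text in fetch_public_posts():
        if post_id in seen:
            continue                 # already processed on an earlier pass
        seen.add(post_id)
        if is_flagged(text):
            send_alert(post_id, text)   # e.g. the boss's text message
    return seen
```

In production this would run on a schedule or off a push feed; tracking seen IDs is what keeps the boss from getting the same alert on every pass.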
Two aspects of this are worth noting. First, company spokespeople emphasize liability. What happens if one of your employees freaks out, comes to work and starts threatening coworkers with a samurai sword? You'll be held responsible because all of the signs of such behavior were clear for all to see on public Facebook pages. That's why you should scan every prospective hire and run continued scans on every existing employee.
In other words, they make the case that now that people use social networks, companies will be expected (by shareholders, etc.) to monitor those services and protect the company from lawsuits, damage to reputation, and other harm. And they're probably right.
Second, the company provides reporting that deemphasizes specific actions and emphasizes character. It's less about "what did the employee do" and more about "what kind of person is this employee?"
Because, again, the goal isn't punishment for past behavior but protection of the company from future behavior.
It's all about the future.
A Cambridge, Mass., company called Recorded Future, which is funded by both Google and the CIA, claims to use its "temporal analytics engine" to predict future events and activities by companies and individual people.
Like Social Intelligence, Recorded Future uses proprietary software to scan all kinds of public web sites, then applies some kind of magic pixie dust to find invisible logical linkages (as opposed to HTML hyperlinks) that lead to likely outcomes. Plug in your search criteria, and the results come in the form of surprisingly accurate future predictions.
Recorded Future is only one of many new approaches to predictive analytics expected to emerge over the next year or two. The ability to crunch data to predict future outcomes will be used increasingly to forecast traffic jams, public unrest, and stock performance. But it will also be used to predict the behavior of employees.
Google revealed last year, for example, that it is developing an algorithm that can accurately predict which of its employees are most likely to quit. It's based on a predictive analysis of things like employee reviews and salary histories. They simply turn the software loose on personnel records, then the system spits out a list of the people who are probably going to resign soon. (I'm imagining the results laser-etched on colored wooden balls.)
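Google hasn't published its actual algorithm, but the general shape of such a system is well understood: fit a classifier on features drawn from personnel records, then rank employees by predicted probability of resigning. Here's a minimal sketch using plain gradient-descent logistic regression; the features (review score, years since last raise), the toy data in the usage below, and all names are invented for illustration.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(rows, labels, lr=0.1, epochs=2000):
    """Fit logistic-regression weights on feature rows; last slot is bias."""
    w = [0.0] * (len(rows[0]) + 1)
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + w[-1])
            err = p - y                      # gradient of log-loss
            for i, xi in enumerate(x):
                w[i] -= lr * err * xi
            w[-1] -= lr * err
    return w

def quit_probability(w, x):
    """Predicted probability that an employee with features x resigns."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + w[-1])

def rank_by_risk(w, employees):
    """Return (name, probability) pairs, most likely to resign first."""
    scored = [(name, quit_probability(w, x)) for name, x in employees]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)
```

For example, training on historical records where label 1 means "resigned" lets `rank_by_risk` produce exactly the kind of list the article describes -- employees sorted by how likely the model thinks they are to walk.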