Technological enforcement

Ha’aretz reports on a new Israeli startup that has developed technology to detect terrorists:

Quietly, even stealthily, this unknown company has been working for five years now on one of the more interesting technological innovations to be created in these parts. WeCU (“We see you,” in case you are unaccustomed to SMS-speak) promises an automated system to detect people with mayhem on their minds. The system integrates methods and doctrines from the behavioral sciences with biometric sensors.

According to the company’s founders, in under a minute it can screen an individual, without his or her knowledge or cooperation and without interfering with routine activities, and disclose intentions to carry out criminal or terror activity. It can identify subjects who are not carrying any suspicious objects, do not demonstrate any suspicious behavior, do not fit into a predefined social or other profile and do not arouse any suspicion.

Unlike systems currently in use, such as polygraphs or biometric systems based on identifying an individual under emotional pressure, WeCU does not attempt to determine whether the subject is lying, concealing information, under stress or feeling guilty. Instead, it seeks to identify concealed intentions by uncovering an associative connection between the subjects and defined threats.

What isn’t clear is whether this system would be deployed in public spaces or would be utilized in interrogations. On the one hand, the company says it can screen suspects “without interfering with routine activities”; on the other hand, the CEO goes on to explain:

How does it work? Givon explains: “The technology is patented. We take advantage of human characteristics, according to which when a person intends to carry out a particular activity or has a great acquaintance or involvement with a particular activity, he carries with him information and feelings that are associated with the subject or activity. In effect, his brain creates a collection of associations that are relevant to the subject.” “When this person is exposed to stimuli targeted at these associations – such as a picture of a partner to the activity, items from the scene of a crime that he carried out, the symbol of the organization in whose name he is acting or a code word – he will respond emotionally and cognitively to these stimuli. The response is expressed with a number of very subtle physiological and behavioral changes during the exposure to the stimulus,” Givon said.
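As a very rough illustration of the kind of measurement Givon describes (everything here is hypothetical; WeCU’s actual method is patented and undisclosed), a detector would need to decide whether a subject’s physiological signal during exposure to a stimulus deviates from that same subject’s baseline:

```python
from statistics import mean, stdev

def reacts_to_stimulus(baseline, stimulus_window, z_threshold=3.0):
    """Toy sketch: flag a response if the average of the signal during
    the stimulus window deviates from the subject's own baseline by
    more than z_threshold standard deviations."""
    mu = mean(baseline)
    sigma = stdev(baseline)
    if sigma == 0:
        return False  # a perfectly flat baseline gives no yardstick
    z = abs(mean(stimulus_window) - mu) / sigma
    return z > z_threshold

# A calm baseline (e.g., skin-conductance samples) followed by a sharp
# jump while the targeted stimulus is shown:
baseline = [5.0, 5.1, 4.9, 5.0, 5.2, 4.8, 5.0, 5.1]
spike = [7.5, 7.8, 7.6]
print(reacts_to_stimulus(baseline, spike))  # prints True
```

A real system would presumably fuse many channels and many stimuli; the point is only that “responds to a targeted stimulus” reduces to a deviation-from-baseline test.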

If the system exposes the subject to “stimuli” how does it do that effectively unless it’s in a controlled environment (such as an interrogation)?

The Guardian’s technology blog wonders (where I first saw the story):

There’s not much info on WeCU Technologies Ltd, but it is a Microsoft Partner and was “incorporated in August 2003”. The partner page has a summary of the approach, but the link to its web page doesn’t work.

While there are some questions about WeCU, one of the company’s founders, Prof. Shlomo Breznitz, was previously involved with another startup, MindFit Technologies, which also focuses on cognition.

The trial was conducted at the Tel-Aviv Sourasky Medical Center of Tel-Aviv University in Israel, where researchers are taking a leading role in the study of age-related disorders. During the two-year clinical trial, doctors conducted a prospective, randomized, double-blind study with active comparators of 121 self-referred volunteer participants age 50 and older. Each study participant was randomly assigned to spend 30 minutes, three times a week during the course of three months at home, using either MindFit or sophisticated computer games.

While all study participants benefited from the use of computer games, MindFit users experienced greater improvement in the cognitive domains of spatial short-term memory, visuo-spatial learning and focused attention. Additionally, MindFit users in the study with lower baseline cognitive performance gained more than those with normal cognition, showing the potential therapeutic effect of home-based computer training software in those already suffering the effects of aging or more serious diseases.

“These research findings show unequivocally that MindFit, which requires no previous computer experience of users, keeps minds sharper than other computer games and software can,” said Prof. Shlomo Breznitz, Ph.D., founder and president of CogniFit. “In fact, the same cognitive domains that MindFit keeps sharp are also central in most daily activities, including driving, that enable aging independently.”

Breznitz continued, “These findings support CogniFit’s belief that if you exercise your brain just as you do your muscles, you can build the speed and accuracy of your mental functions, significantly. ‘Working out’ with MindFit three times a week from the comfort of your home will yield similar results for your brain as exercising at the gym with that same frequency does for your muscles.”

In unrelated news, Prof. Breznitz was saved from the Holocaust by being hidden at a Catholic orphanage.

In somewhat related news, Japan is considering fielding a facial recognition device to determine whether a person buying cigarettes from a vending machine is of legal age (20 in Japan) to do so.

Cigarette vending machines in Japan may soon start counting wrinkles, crow’s feet and skin sags to see if the customer is old enough to smoke. The legal age for smoking in Japan is 20 and as the country’s 570,000 tobacco vending machines prepare for a July regulation requiring them to ensure buyers are not underage, a company has developed a system to identify age by studying facial features.

By having the customer look into a digital camera attached to the machine, Fujitaka Co’s system will compare facial characteristics, such as wrinkles surrounding the eyes, bone structure and skin sags, to the facial data of over 100,000 people, Hajime Yamamoto, a company spokesman said.
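To make the age-check idea concrete (the feature names and reference data below are invented for illustration; Fujitaka’s actual system and database are not public), matching a face against a library of labeled reference faces can be sketched as a nearest-neighbor lookup:

```python
import math

# Hypothetical reference data: (wrinkle, sag, bone-structure) scores -> age.
# A real system would hold ~100,000 labeled faces; three suffice here.
REFERENCE = [
    ((0.1, 0.1, 0.3), 16),
    ((0.4, 0.3, 0.5), 25),
    ((0.8, 0.7, 0.6), 60),
]

def estimate_age(features):
    """Return the age of the closest reference face by Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    _, age = min((dist(features, f), age) for f, age in REFERENCE)
    return age

def may_buy_cigarettes(features, legal_age=20):
    return estimate_age(features) >= legal_age

print(may_buy_cigarettes((0.45, 0.35, 0.5)))  # closest to the 25-year-old: True
```

The hard part in practice is extracting reliable feature scores from a camera image, not the lookup itself, which is presumably why such systems are paired with fallbacks like ID cards.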

And infrared sensors give the Air Force an opportunity to detect and eliminate a threat to soldiers on the ground.

The sniper never knew what hit him. The Marines patrolling the street below were taking fire, but did not have a clear shot at the third-story window that the sniper was shooting from. They were pinned down and called for reinforcements.

Help came from a Predator drone circling the skies 20 miles away. As the unmanned plane closed in, the infrared camera underneath its nose picked up the muzzle flashes from the window. The sniper was still firing when the Predator’s 100-pound Hellfire missile came through the window and eliminated the threat.

The airman who fired that missile was 8,000 miles away, here at Creech Air Force Base, home of the 432nd air wing. The 432nd officially “stood up,” in the jargon of the Air Force, on May 1, 2007. One year later, two dozen of its drones patrol the skies over Iraq and Afghanistan every hour of every day. And almost all of them are flown by two-man crews sitting in the air-conditioned comfort of a “ground control station” (GCS) in the Nevada desert.

I suppose that there are those who will see in this increased use of video and recognition technology a manifestation of Big Brother. In the limited ways these systems have been deployed so far, it doesn’t seem that governments are getting too intrusive. Then again, maybe it would be reassuring if governments limited this technology to really important uses instead of (unsuccessfully) saturating society with it.

Crossposted on Soccer Dad.


5 Responses to Technological enforcement

  1. Eric J says:

    Is it just me, or does this sound suspiciously like Dr. Baltar’s Cylon Detector?

    Don’t let them have any nuclear weapons.

  2. Tatterdemalian says:

    I wouldn’t mind society being saturated, as long as civilians are allowed to own it too. “Big Brother” happens when the watchers hold themselves above being watched, as is sadly the case in the UK, where police-monitored cameras are ubiquitous but civilian-monitored cameras are either illegal or will get your butt kicked by angry cops if they catch you filming them.

  3. One of the most interesting government-surveillance scenarios I ever read is in Greg Bear’s novel Queen of Angels.

    In it, a government agency, “Public Oversight,” has monitoring devices installed everywhere, including places we’d consider private. Freedom is maintained by imposing extreme restrictions on what Public Oversight can do with the data they collect. They release all kinds of information in aggregates, to do what we, today, do with surveys and focus groups. Any specific information, however, is only released on a need-to-know basis, and they decide who needs to know (and almost nobody has such a need).

    In the novel, a police officer requests information about a suspect. Public Oversight refuses to disclose the information, but provides some other, seemingly trivial, piece of information, saying that normal police work can produce all the remaining evidence needed to solve the case.

    A very interesting system, even though we all know it could never exist in reality without being completely abused by a government that would use it to become a tyranny.

  4. Jeff says:

    WeCU reminds me a little bit of what Bruce Schneier calls “security by obscurity”: trust us, it works, and you do not need to know how. Those types of systems are always hacked.

  5. Maquis says:

    I think we need to incorporate facial recognition technology from digital cameras into electronic gunsights. I mean, you never know when the zombies are coming, and a non-head shot is a wasted bullet!
