To The Bristol Cable on Face Recognition
This is a copy of a letter I wrote to the excellent Bristol Cable in reply to an interview they ran on facial recognition. It was eventually published as an article here.
To the Editor
I read your interview with Lyndon Smith and Wenhao Zhang of the Bristol Robotics Laboratory with a keen interest.
Clearly Smith and Zhang are at the cutting edge of their field, and their work on 3D machine vision is something that as an engineer (albeit in a different field) and former student of the University of Bristol, I can really appreciate as an academic achievement. I would not wish to end my letter without acknowledging this point, or let the rest of it detract from their work.
I was extremely happy to see that your reporter explicitly asked whether the researchers think about the potential for the misuse of their research. This is something that is all too often missed entirely when journalists report on new research. I believe it to be especially important with regard to any sort of machine intelligence or automated surveillance.
While it is heartening to see that Smith and Zhang have clearly thought about how people might perceive this technology, I am afraid that I am much more sceptical about how it will be used than their comments suggest they are.
Their comments about how fast their system is suggest that it is optimised for situations where there are many people in the scene being analysed, such as public places and transport hubs. While the 'secure lock' example they give is a reasonable application, it would be an over-engineered solution to a problem that does not exist. One does not need a throughput of 100 recognitions per second (extrapolating from their figure of 10ms per face) to unlock a door or access an ATM.
Let me be clear: a system with extremely accurate recognition of people's faces is a tool to be used for surveillance and 'security' - by whatever definition one finds most palatable. Even if it is not explicitly deployed in a crime-prevention capacity (as with the ticketing example they suggest), all of the data such a system gathered could still be made available to law enforcement. It need not even be surveillance by the state. Advertising firms use facial recognition techniques (though not necessarily those developed by the BRL) to track people in public places who look at their marketing campaigns.
Smith says that they envisage this as a "tool to help people if they want to use it" and that "we're not going to force people if they don't want to". This strikes me as naive beyond reason. If deployed across a public transport network, the only way to opt out of such a system completely would be to not use that service - which rather defeats the object of a public service! If such a system were used to track the success of real-world marketing campaigns, how would I opt out then? Stay inside? I cannot opt out of CCTV at the moment, and I don't see how this system, if deployed widely, would be any different.
I wonder what a train conductor might think of the suggestion that the system "could be an enabler of a vast reduction of labour associated with these things"?
My point is not that Smith and Zhang are wrong to develop this technology. Nor is it a reflection on them or the work undertaken by the BRL. My point is to take issue with engineers developing technologies without fully and honestly acknowledging the societal implications of their work. Obviously one cannot foresee the future, but I think it is reasonable to say that a technology which recognises faces might be used to recognise faces for more than making train tickets more efficient!
As engineers, we build tools which can be used for good and for ill. Most engineers, particularly academics, have very little control over the final use cases for their work. This is important to remember, but I do not think it completely insulates them from a responsibility to consider how it affects the world.
Academia is a system which rewards grant money and publications in prestigious journals, not invention for social good. If, as engineers, we continue to delude ourselves into thinking we can completely abdicate responsibility for how our work affects society, then we have truly missed what it means to be part of a society and, in my opinion, the point of being an engineer.
Yours Sincerely, Ben Marshall
August 2017