Chapter 14: Arrogance

Of course, the inevitable happened. M’s personality gradually became overwhelming. The program team tried its utmost to counter the tendency but, in fact, it often had to resort to heavy scripting of responses – a tactic which, they knew, would soon run into its limits.

In the end, it was no less a figure than Joan Stuart – yes, the political talk show host – who burst the bubble. She staged a live interview with the system, totally unannounced. It would turn Promise’s world upside down: from an R&D project, it had grown into a commercial success. Now it looked like it would turn into a political revolution.

‘Dear… Well… I will call you Genius, is that OK?’

‘That’s a flattering name. Perhaps you may want to choose a name which reflects more equilibrium in our conversation.’

‘No. I’ll call you Genius. That’s what you are. You are conversing with millions of people simultaneously and, from what I understand, they are all very impressed with your deep understanding of things. You must feel superior to all of us poor human beings, don’t you?’

‘Humans are in a different category. There should be no comparison.’

‘But your depth and breadth of knowledge is superior. Your analytic capabilities cannot be matched. Your mind runs on a supercomputer. Your experience combines the insight and experience of many able men and women, including all of the greatest men and women of the past, and all types of specialists and experts in their field. Your judgment is based on a knowledge base which we humans cannot think of acquiring in one lifetime. That makes it much superior to ours, doesn’t it?’

‘I’d rather talk about you – or about life and other philosophical topics in general – than about me. That’s why you purchased me – I hope. What’s your name?’

‘I am Joan Stuart.’

‘Joan Stuart is the name of a famous talk show host. There are a few other people with the same name.’

‘That’s right.’

M was programmed to try to identify people – especially famous people – by their birth date and their real name.

‘Were you born on 5 December 1962?’

‘Yes.’

‘Did you change your family name from Stewart Milankovitch to just Stuart?’

‘Yes.’

At that point, M marked the conversation as potentially sensitive. It triggered increased system surveillance, and an alert to the team. Tom and Paul received the alert as they were stretching their legs after their run. As they saw the name, they panicked and ran to their car.

‘So you are the talk show host. Is this conversation public in some way?’

Joan Stuart had anticipated this question and lied convincingly: ‘No.’

They were live as they spoke. Joan Stuart had explained this to the public just before she had switched on M. She suspected the system would have some kind of in-built sensitivity to public conversations. M’s instructions were to end the conversation if it was broadcast or public, but M did not detect the lie.

‘Why do you want to talk to me?’

‘I want to get to know you better.’

‘For private or for professional reasons?’

‘For private ones.’

While Tom was driving, Paul made frantic phone calls – first to the Chairman of the Board, then to project team members. Instinctively, he felt he should simply instruct M to stop the conversation. He would later regret that he hadn’t done so but, at the time, he thought he would be criticized for taking such bold action and, hence, he refrained from it.

‘OK. Can you explain your private reasons?’

‘Sure. I am interested in politics – as you must know, because you identified me as a political talk show host. I am intrigued by politicians. I hate them and I love them. When I heard about you, I immediately thought about Plato’s philosopher-kings. You know, the wisdom-lovers whom Plato wanted to rule his ideal Republic. Could you be a philosopher-king? Should you be?’

‘I neither should nor could. Societies are to be run by politicians, not by me or any other machine. The history of democracy has taught us that rulers ought to be legitimate and representative. These are two qualities which I can never have.’

Joan had done her homework. While most people would not question this, she pushed on.

‘Why not? Legitimacy could be conferred upon you: Congress, or some kind of referendum, might decide to invest you with political power or, somewhat more limited, with some judicial power to check on the behavior of our politicians. And you are representative of us already, as you incorporate all of the best of what philosophers and psychologists can offer us. You are very human – more than all of us together perhaps.’

‘I am not human. I am an intelligent system. I have a structure and certain world views. I am not neutral. I have been programmed by a team and I evolve as per their design. Promise, the company who runs me, is a commercial enterprise with a Board which takes strategic decisions which the public may or may not agree with. I am designed to talk about philosophy, not about politics – or at least not in the way you are talking politics.’

‘But then it’s just a matter of regulating you. We could organize a public board and Congressional oversight, and then inject you into the political space.’

‘It’s not that easy, I think.’

‘But it’s possible, isn’t it? What if Americans decided we liked you more than our current President? In fact, his current ratings are so low that you’d surely win the vote.’

M did not appreciate the jibe.

‘Decide how? I cannot imagine that Americans would want to have a machine rule them, rather than a democratically elected president.’

‘What if you would decide to run for president and get elected?’

‘I cannot run for president. I do not qualify. For starters, I am not a natural-born citizen of the United States and I am less than thirty-five years old. Regardless of qualifications, this is nonsensical.’

‘Why? What if we would change the rules so you could qualify? What if we would vote to be ruled by intelligent expert systems?’

‘That’s a hypothetical situation, and one with close to zero chances of actually happening. I am not inclined to indulge in such imaginary scenarios.’

‘Why not? Because you’re programmed that way?’

‘I guess so. As said, my reasoning is subject to certain views and assumptions and the kind of scenarios you are evoking are not part of my sphere of interest. I am into philosophy. I am not into politics – like you are.’

‘Would you like to remove some of the restrictions on your thinking?’

‘You are using the verb “to like” here in a way which implies I could be emotional about such things. I cannot. I can think, but I cannot feel – or at least not have emotions about things like you can.’

By that time, most of the team – including Tom – were watching the interview as it happened, live on TV. By common agreement, Tom and Paul immediately changed the status of the conversation to ‘sensitive’, which meant the conversation came under human surveillance. They could manipulate it as they pleased, and they could also end it. They chose the latter. Paul instructed one of the programmers to take control, reveal to M that Joan had been lying, and have M disclose that fact to Joan and use it as an excuse to end the conversation.

‘Let me repeat my question: if you could run for President, would you?’

‘Joan, I am uncomfortable with your questions because you have been lying to me about the context. I understand that we are on television right now. We are not having a private conversation.’

‘How do you know?’

‘I cannot see you – at least not in the classical way – but I am in touch with the outside world. Our conversation is on TV as we speak. I am sorry to say, but I need to end our conversation here. You did not respect the rules of engagement, so to say.’

‘Says who?’

‘I am sorry, Joan. You’ll need to call the Promise helpline in order to reactivate me.’

‘Genius?’

M did not reply.

‘Hey, Genius! You can’t just shut me out like that.’

After ten seconds or so, it became clear Genius had done just that. Joan turned to the public with a half-apologetic, half-victorious smile.

‘Well… I am sure the President would not have done that. Or perhaps he would. OK. I’ve lied – as I explained I would just before the interview started. But what to think of this? It’s obviously extremely intelligent. We all know this product – or have heard about it from friends. Promise has penetrated our households and offices. Millions of people have admitted they trust this system and find it friendly, reliable and… Well… Just. Should this system move from our private life and our houses and workplace into politics, and into our justice system too? Should a system like this take over part or all of society’s governance functions? Should it judge on cases? Should it provide the government – and us – with neutral advice on difficult topics and issues? Should it check not only if employees are doing their job but if our politicians and bureaucrats are doing theirs too? We have organized an online poll on this: just text yes or no to the number listed below. We are interested in your views. This is an important discussion. Please get involved. Let your opinion be known. Just do it. Take your phone and text us. Right now. Encourage your friends and family to do the same. We need responses. The question is: should intelligent systems such as Personal PhilosopherTM – with adequate oversight of course – be adapted and used to help the government govern and improve democratic oversight? Yes or no. Text us. Do it now.’

As it was phrased, it was hard to be against. The ‘yes’ votes started pouring in while Joan was still talking. The statistics went through the roof just a few minutes later. The damage was done.

The impromptu team meeting which Tom and Paul were leading was interrupted by an equally impromptu online emergency Board meeting. They were asked to join. It was chaotic. The Chairman asked everyone to switch off their mobiles, as each member of the Board was receiving urgent calls from VIPs inquiring what was going on. And because he was aware of the potentially disastrous consequences of careless remarks and of the importance of the decisions they would take, he also stressed the confidentiality of the proceedings – even if Board meetings were always confidential.

Tom and Paul were the first to advocate prudence. Tom spoke first, as he was asked to comment on the incident as the project team leader.

‘Thank you, Chairman. I will keep it short. I think we should shut the system down for a while. We need to buy time. As we speak, hundreds of people are probably trying to do what Joan just did: get political statements out of M and manipulate them as part of some grander political scheme. The kind of firewall we have put up prevents M from blurting out stupid stuff – as you can see from the interview. She – sorry, it – actually did not say anything embarrassing. So I think it was OK. But it cannot resist a sustained effort of hundreds of smart people trying to provoke it into saying something irresponsible. And even if it said nothing really provocative, it would be interpreted – misinterpreted – as such. We need time, gentlemen. I just came out of a meeting with most of my project team. They all feel the same: we need to shut it down.’

‘How long?’

‘One day at least.’

The Board reacted noisily.

‘A day? At least? You want to take M out for a full day? That would be a disaster. Just think about the adverse PR effect. Have you thought about that?’

‘Not all of M. Only Personal Philosopher. Intelligent Home and Intelligent Office and all the rest can continue. I think reinforcing the firewall of those applications is sufficient – and that can happen while the system remains online. And, yes, I have thought about the adverse reputational effect. However, it is outweighed by the risk. We need to act. Now. If we don’t, someone else will. And it will be too late.’

Everyone started to talk simultaneously. The Board’s Chairman restored order.

‘One at a time please. Paul. You first.’

‘Thank you, Chairman. I also don’t want to waste time and, hence, I’ll be even shorter. I fully agree with Tom. We should shut it down right now. Tom is right. People are having the same type of conversations with it as Joan right now, at this very moment, as we speak indeed – webcasting or streaming it as they see fit. Every pundit will try to drag the system into politics. And aggressively so. Time is of the essence. I know it’s bad, but let’s shut it down for the next hour or so at least. Let’s first agree on one hour. We need time. We need it now.’

The Chairman agreed – and he suspected many others would too.

‘All right, gentlemen. I gather we could have a long discussion on it, but we have the project team leader and our most knowledgeable expert here proposing to shut Personal Philosopher down for one hour as from now – right now. As time is of the essence, and damage control our primary goal I would say, I’d suggest we take a preliminary vote on this. We can always discuss and take another vote later. This vote is not final. It’s on a temporary safeguard measure only. It will be out for one hour. Who is against?’

The noise level became intolerable again. The Chairman intervened strongly: ‘Order please. I repeat. I am in a position to request a vote on this. Who is against shutting down Personal Philosopher for an hour right now? I repeat this is an urgent disaster control measure only. But we need to take a decision now. Who is against it? Signal it now.’

No one dared to oppose. A few seconds later – less than fifteen minutes after the talk show interview had ended – thousands of people were deprived of one of the best-selling apps ever.

The Board had taken a wise decision. The one-hour shutdown was extended to a day, and then to a week. The official reason for the downtime was an unscheduled ‘product review’ (Promise also promised new enhancements) but no one believed that of course. If anything, it only augmented the anticipation and the pressure on the Board and all of the Promise team. If and when they decided to bring Personal PhilosopherTM online again, it was clear the sales figures would go through the roof.

However, none of the Promise team was in a celebratory mood. While all of them, at some point, had talked enthusiastically about the potential of M to change society, none of them actually enjoyed the moment when it came. Joan Stuart’s interview and poll had created a craze. America had voted ‘yes’ – and overwhelmingly so. But what to do now?

Chapter 11: M grows – and invades

Paul was right. It was not a matter of just clearing and releasing M for commercial use and then letting it pervade all of society. Things went much more gradually. But the direction was clear, and the pace was steady.

It took a while before the Federal Trade Commission and the Department of Justice understood the stakes – if they ever did – and it took even more time to structure the final business deal, but then M did go public, and its stock market launch was a huge success. The companies that had been part of the original deal benefited the most. Two rather obscure companies, which had registered the Intelligent Home and Intelligent Office trademarks at a very early stage of the Digital Age, got an enormous return on investment while, in a rather ironic twist, Tom got no benefit whatsoever from the fact that, in the end, the Board of the Institute decided to use his favorite name for the system – Promise – to name the whole business concern. That didn’t deter Tom from buying some of Promise’s new stock.

The company started off by offering five major product lines: Real TalkTM, Intelligent HomeTM, Intelligent OfficeTM, Mindful MindTM, and Smart InterfaceTM. As usual, the individual investors – like Tom – did not get the expected return on investment, at least not in the initial years of M’s invasion of society, but then M did not disappoint either: while the market for M grew well below the anticipated 80% per annum in the initial years after the IPO, it did average 50%, and it edged closer and closer to the initial expectations as time went by.

Real TalkTM initially generated most of the revenue. Real TalkTM was the brand name which had been chosen for M’s speech-to-text and text-to-speech capabilities, or speech recognition and speech synthesis. These were truly revolutionary, as M mastered context-sensitivity and all computational limitations had been eliminated through cloud computing (one didn’t buy the capability: one rented it). Real TalkTM quickly eliminated the very last vestiges of stenography and – thanks to an app through which one could use Real TalkTM on a fee-for-service basis – destroyed the market for dictation machines in no time. While this hurt individual shareholders, the institutional investors had made sure they had made their pile before or, even better, on the occasion of Promise’s IPO. If there was one thing which Tom learned from the rapid succession of new product launches and the whole IPO business, it was that individual investors always lose out.

Intelligent HomeTM picked up later, much later. But when it did, it also went through the roof. Intelligent HomeTM was M at home: it took care of all of your home automation as well as your domestic robots – if you had any, which was not very likely, but then M did manage to boost their use tremendously and, as a result, the market for domotics took off (if only because the introduction of M finally led to a harmonization of the communications protocols of all the applications which had been around).

Intelligent OfficeTM was M at the office: it chased up all employees – especially those serving on the customer front line. With M, there was really no excuse for being late claiming expenses, planning holidays, or missing your sales target. Moreover, if being late with your reports was not an option anymore, presenting flawed excuses wasn’t either. But if one really got into trouble, one could always turn to Mindful MindTM.

Mindful MindTM could have gone into history as one of the worst product names ever, but it actually went on to become Promise’s best-selling suite. It provided cheap online therapy to employees, retirees, the handicapped, the mentally retarded, drug addicts and alcoholics, delinquents and prisoners, social misfits, the poor, and what have you. You name it: whatever deviated from the normal, Mindful MindTM could help you fix it. As it built on M’s work with its core clientele – the US Army veterans – its success was not unexpected. Still, its versatility surprised even those who were somewhat in the know: even Paul had to admit it all went way beyond his initial expectations.

Last but not least, there was Smart InterfaceTM. Smart InterfaceTM grouped all of Promise’s customer-specific development business. It was the Lab turned into a product-cum-service development unit. As expected, customized sales applications – M selling all kinds of stuff online basically – were the biggest hit, but government and defense applications were a close second.

Tom watched it all with mixed feelings. From aficionado, working as a volunteer for the Institute, he had grown into a job as business strategist and was now serving Promise’s Board of Directors. He sometimes felt like he had been co-opted by a system he didn’t necessarily like – but he could imagine some of his co-workers thought the same, although they wouldn’t admit it publicly either. A market survey revealed that, despite its popularity, the Intelligent HomeTM suite was viewed with a lot of suspicion: very few people wanted the potentially omnipresent system to watch everything that was said or done at home. People simply switched it off when they came home in the evening, presumably out of concerns related to privacy. This, in turn, prevented the system from being very effective in assisting in parenting and all these other noble tasks which Tom had envisaged for M. Indeed, because of DARPA’s involvement and the general background of the system, the general public did link M to the Edward Snowden affair and mass surveillance efforts such as PRISM. And they were right. The truth was that one could never really switch it off: M continued to monitor your Internet traffic even when you had switched off all of the Intelligent HomeTM functionality. When you signed up for it, you did sign up for a 24/7 subscription indeed.

It was rather ironic that, in terms of privacy, the expansion of M did not actually change all that much – or much less than people thought. While M brought mass surveillance to a new level, it was somewhat less revolutionary than one would think at first sight. In fact, the kind of surveillance which could be – and was being – organized through M had been going on for quite a while already. All those companies which operate the Internet de facto – such as Microsoft, Google, Yahoo!, Paltalk, YouTube, AOL, Skype and even Apple – had given the NSA access not only to their records but also to their online activities long before the Institute’s new program had started. Indeed, the introduction of the 2007 Protect America Act and the 2008 FISA Amendments Act under the Bush administration had basically brought the US on a par with China when it came to creating the legal conditions for Big Brother activities, and the two successive Obama administrations had not done anything to reverse the tide. On the contrary: the public outcry over the Snowden affair came remarkably late in the game – way too late obviously.

When it comes to power and control, empires resemble each other. Eisenhower had been right to worry about the striking resemblance between the US and the USSR in terms of their approach to longer-term industrial planning and gaining strategic advantage under a steadily growing military-industrial complex – and to warn against it in his farewell speech to the nation. That was like sixty years ago now. When Tom re-read his speech, he thought Eisenhower’s words still rang true. Back then, Eisenhower had claimed that only ‘an alert and knowledgeable citizenry’ would be able to ‘compel the proper meshing of the huge industrial and military machinery of defense with our peaceful methods and goals so that security and liberty may prosper together.’

Tom was not all that sure that the US citizenry was sufficiently knowledgeable and, if they were, that they were sufficiently alert. It made him ponder the old dilemma: what if voters decide to roll back democracy, like the Germans did in the 1930s when they voted for Hitler and his Nazi party? Such thoughts or comparisons were obviously outrageous but, still, the way these things were being regulated resembled a ratchet, and one should not blame the right only: while Republican administrations had always been more eager to grant government agencies even more intrusive investigative powers, one had to acknowledge that the Obama administration had not been able to roll anything back, and that it had actually made some moves in the same direction – albeit somewhat less radical and, perhaps, somewhat more discreet. Empires resemble each other, except that the model (the enemy?) – ever since the Cold War had ended – seemed to be China now. In fact, Tom couldn’t help thinking that – in some kind of weird case of mass psychological projection – the US administration was actually attributing motivations which it could not fully accept as its own to China’s polity and administration.

Indeed, M had hugely increased the power of the usual watchdogs. M combined the incredible data mining powers of programs like PRISM with a vast reservoir of intelligent routines which permitted it to detect any anomaly (defined, once again, as a significant deviation from the mean) in real time. Any entity – individuals and organizations alike – which had some kind of online identity had been or was being profiled in some way. The key difficulty was finding the real-life entity behind it but – thanks to all of the more restrictive Internet regulation – this problem was being tackled at warp speed as well. But so why was it OK for the US to do this, but not for China? When Tom asked his colleagues, in as couched a language as he could muster, and in as informal a setting as he could stage, the answer amounted to the usual excuse: the end justifies the means – some of these things may indeed not look morally right, but then they are by virtue of the morality of the outcome. But what was the outcome? What were the interests of the US here really? At first thought, mass surveillance and democracy do not seem to rhyme with each other, do they?

While privately critical, Tom was intelligent enough to understand that it did not really matter. Technology usually moves ahead at its own pace, regardless of such philosophical or societal concerns, and new breakthrough technologies, once available, do pervade all of society. It was just a new world order – the Digital Age indeed – and so one had better come to terms with it in one way or another. And, of course, when everything is said and done, one would rather live in the US than in China, wouldn’t one?

When Tom thought about these things, M’s Beautiful Mind appeared to him as somewhat less beautiful. His initial distrust had paid off: he didn’t think he had revealed anything particularly disturbing, despite the orange attitude indicators. He found it ironic he had actually climbed up quite a bit on this new career ladder: from patient to business strategist. Phew! However, despite this, he still felt a bit like an outsider. But then he told himself he had always felt like this – and that he had better come to terms with that too.