Companies That Don’t Take Deepfakes Seriously Could Pay Heavily

Life is as hard for companies as it is for any other living being. The struggles of business life are harsh: the dangers are large and the losses are great. What happens to an individual affects them, their family and the people around them. But when something negative happens to a company, then depending on its size, thousands of employees and their families, and sometimes even their cities, their sectors or, in the worst cases, the entire country can be harmed.

All managers, including CEOs, act as the intelligent, decision-making centre of their companies, and as such are forced to form a relationship with “Artificial Intelligence” (AI). AI is ground-breaking in its knowledge, its skills and its capacity to learn. Companies need to decide how close they want to get to AI and how deeply to embed it in their business processes. Careful use of AI can increase efficiency and profitability, but it is just as important to judge when, and how, to keep it at arm’s length. It should be noted that while the business world is investing hundreds of millions of dollars in machine-learning-based artificial intelligence technology, it is also creating a “technology monster” that raises growing concerns about a loss of control in the future. Ingenious criminal methods, acts of terror and even wars show that once a new technology is developed, who will end up using it is unpredictable.

Deepfakes are problematic for companies.


Regardless of a company’s size or sector, managers who underestimate the popular 3 Ps (policy, porn and police crimes), and deepfake videos in particular, might end up paying a heavy price. If company managers shrug off deepfake videos with a “let sleeping dogs lie” attitude, they may be too late to develop an antidote for this deadly poison. Deepfake attackers could be a competitor, a professional criminal, a former or current employee who wants to damage the company, or even a shareholder. In the social media era a deepfake video can destroy the market, profitability and reputation of a company or brand in seconds. After the attack, explaining what really happened will not undo the deep damage or recover what the company has lost.

According to New Knowledge (a firm formed by national security, digital media and machine learning experts) in its 2019 Information Integrity Brand Disinformation Impact Report, 78% of consumers believe that fake news and misinformation damage institutional and brand image. There are two main reasons why deepfake threats against companies are so frightening: the attack scenarios are unlimited, and the machine learning power of artificial intelligence can turn them into the perfect crime tool. This makes it extremely difficult for companies to take precautions, as there are countless ways to attack a targeted company with deepfake videos and cause untold damage. Examples include:

-A deepfake video depicting a company employee, or a customer, being mistreated or harassed at a brand’s point of sale. The resulting loss of reputation and trust could wipe out the company’s market and collapse its sales.

-A deepfake video portraying inaccurate statements or behaviour from a brand, or from a company manager, relating to the company’s financial status. This could greatly damage a company’s financial clout, credibility, partnership goals or share price. Public perception of misbehaviour by a company manager can have the same effect. Celebrities have already started to become targets of such damage: Tesla shares lost 6% of their value after an unusual podcast in which its CEO, Elon Musk, smoked weed and drank whiskey, and the episode may have contributed to two senior managers announcing that they were leaving Tesla. Additionally, according to CNBC News, the US Air Force also opened an investigation into Musk regarding his SpaceX project contracts.

-In a phone or video call within a company, the voices and/or appearance of senior managers can be imitated with deepfake-focused artificial intelligence applications. Using synthetic audio or video recordings of its managers, a company can fall victim to fraud causing enormous monetary damage. For example, about a month ago a money transfer triggered by a “fake-voiced CEO command” cost the company in question around a quarter of a million dollars.

Deepfake video threat turns HR management into a nightmare.


Numerous problems within companies related to human resources (HR), and to the management decisions tied to them, can turn into a nightmare because of the deepfake danger. A jealous co-worker could generate a deepfake video depicting drug use or theft and get another employee fired. With smart machine learning, it takes only 53 seconds to turn a normal video into a fake one. Once a deepfake video goes viral, the damage has already been done. Attention must be paid to this sneaky tool, as it threatens our democratic society and our legal system.

Currently, awareness is the strongest shield.


Awareness is currently the strongest protection for companies. Starting with CEOs and all other managers, companies should learn, in this deepfake era, not to believe everything they see and hear, and not to take decisions before verifying that the content in front of them is genuine. Companies might also employ an expert data analysis team to support them in this effort. In addition, companies must trust that the information security field will quickly develop technology that permits the verification of content, as well as the security solutions needed to distinguish fake videos from real ones. Naturally, the most effective anti-deepfake security software will be offered to corporate users as soon as possible, and companies should invest in it to keep pace with the reality of the world.