Sunil Bharti Mittal’s ‘AI’ voice as villain; employee’s ‘wisdom’ prevented loss of crores

Dubai: An attempt was made to commit fraud in Dubai using an artificial intelligence (AI) clone of an Indian billionaire’s voice. A large sum was saved only because of the timely alertness of an employee. Sunil Bharti Mittal, founder and chairman of the multinational Bharti Enterprises, revealed the shocking personal experience.
One of the businessman’s senior executives was targeted with a call imitating his voice and instructed to approve a large financial transfer. Mittal said even he was stunned by the quality of the imitation. “One of my senior finance executives in Dubai, who manages our Africa business, received a call in my voice. He was instructed to transfer large sums of money. He had the good sense to realize that I would never make such a request over the phone, and he did not act on it,” he said.
The executive, who did not want to be named, said he was suspicious of the voice and immediately reported it, averting a huge financial loss. The incident comes amid growing concerns, globally and in the UAE, about the misuse of AI, particularly deepfake technology.

  • Deepfakes: UAE Cyber Security Council Warns

The UAE Cyber Security Council recently warned about the risks of fraud, breach of privacy and misinformation. Deepfakes are AI-generated media designed to mimic real people. The technique can be used to create videos, images or audio, posing a serious threat to individuals and organizations.
The UAE Cyber Security Council has launched an awareness campaign, warning that sharing deepfake content could lead to fraud or legal consequences. A recent Kaspersky Business Digitization survey found that while 75 percent of UAE employees believe they can identify deepfakes, only 37 percent successfully distinguished between real and AI-generated images during testing.
