AI can mimic your brother’s or sister’s voice to cheat you — these people got robbed; here’s how to protect yourself

AI Voice Scam: New cases of cyber fraud keep coming to light in which scammers adopt different methods to cheat people. In the last 7 days, cases have surfaced from Delhi and Lucknow where scammers faked the voice of a relative or family friend and cheated victims out of several thousand rupees.

Mimicking another person’s voice is a skill some people have long practised, but today many scammers use AI technology to clone voices and then use those cloned voices to defraud people. Here are the recent cases of fraud and what you can do to protect yourself.

New fraud cases:

Fraud in Delhi: A case of cyber fraud came to light in Delhi last week in which the victim was duped with a fake AI-generated voice and a fabricated kidnapping. Scammers told the victim that his brother’s son had been kidnapped, and he ended up losing Rs 50,000 to the fraud.


A new case from Lucknow: A case has also come to light from Lucknow, Uttar Pradesh, in which an AI-generated voice was used. A 25-year-old man received a call from someone whose voice sounded like that of one of his relatives. The caller asked for money to be transferred via UPI, and the victim was ultimately defrauded of Rs 44,500.

More than half of the cases involve AI voice-cloning scams

According to a recent survey report by cyber security firm McAfee, 69 per cent of the cyber fraud cases affecting Indians are related to AI voice scams. The survey also found that 47 per cent of Indians have either been a victim of such fraud themselves or know of a case.

Always take these 5 precautions:

In an AI voice scam, cyber thugs clone a voice so that it sounds like one of the victim’s relatives, and many people fall for it. Here are some safety tips that can help you protect yourself from AI voice scams.

Cross-check once before sending money:

In an AI voice scam, scammers often call from an unfamiliar or spoofed number and demand money. The pretext can range from a kidnapping to an urgent EMI payment. In such a situation, you must always cross-check before sending anything.

Call in case of kidnapping:

If someone tries to scare you with a kidnapping claim, do not panic. Stay calm and call the person who has supposedly been kidnapped on their own number. If they pick up, the truth will be revealed.

Try to talk longer:

Cyber thugs generate duplicate voices with the help of AI, so try to keep the caller talking for a while. Meanwhile, you can contact other relatives and ask them why the money is supposedly needed. A scammer will struggle to keep up a longer conversation, which can help you tell a fake call from a real one.

Listen for a mechanical tone:

Some AI voice changers introduce a robotic or mechanical quality into the voice. If you talk to the caller for a while and pay attention to this mechanical tone, you can identify a fake call.

Don’t panic, have some patience:

Whether the story is a kidnapping or a relative in urgent need, the scammer will always push you to transfer the money quickly. Do not panic. First cross-check the claim and try to establish the truth.
