Electoral Commission’s warning on deepfakes – proactive security measures are needed

The rise of generative AI and new threats such as deepfakes means that regulations must be updated to combat these risks. For regulation to be effective, governments, industry and companies need to come together to share ideas and tackle the problem collectively.

As well as regulation, it's important that people are educated on how to spot deepfakes. A deepfake video usually contains inconsistencies that become evident when a face or body moves: an ear may show irregularities, or the iris may lack the natural reflection of light.
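
As a rough illustration of how such visual checks can be automated, the sketch below uses OpenCV to compare the sharpness of a detected face region against the rest of each frame, since a noticeably blurrier face can hint at the blending artefacts that face swaps leave behind. The heuristic, the threshold and the video file name are illustrative assumptions, not a production detector.

```python
# Minimal sketch (not a production detector): flag frames where the face
# region is much blurrier than the surrounding frame, a simple proxy for
# the blending artefacts that swapped faces often exhibit.
import cv2


def sharpness(gray_img):
    # Variance of the Laplacian is a common focus/sharpness measure.
    return cv2.Laplacian(gray_img, cv2.CV_64F).var()


def flag_suspicious_frames(video_path, ratio_threshold=0.4):
    face_detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    capture = cv2.VideoCapture(video_path)
    suspicious = []
    frame_index = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_detector.detectMultiScale(gray, 1.1, 5)
        for (x, y, w, h) in faces:
            face_sharpness = sharpness(gray[y:y + h, x:x + w])
            frame_sharpness = sharpness(gray)
            # A face noticeably blurrier than its surroundings can indicate
            # a pasted-in synthetic face (heuristic; expect false positives).
            if frame_sharpness > 0 and face_sharpness / frame_sharpness < ratio_threshold:
                suspicious.append(frame_index)
        frame_index += 1
    capture.release()
    return suspicious


if __name__ == "__main__":
    # "interview.mp4" is a hypothetical input file used for illustration.
    print(flag_suspicious_frames("interview.mp4"))
```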

Technologies that use AI techniques, such as biometrics, can also be used to detect deepfakes and protect the integrity of elections. Anti-spoofing techniques within biometrics can spot the differences between synthetic and authentic voices. Furthermore, multi-factor authentication processes that include voice biometrics and facial recognition make it much harder to impersonate politicians or election spokespeople, as sketched below.
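
The sketch that follows illustrates the multi-factor idea at the level of score fusion. It assumes hypothetical voice and face verification engines that each return a match score and an anti-spoofing (liveness) score in the range 0 to 1; the thresholds, class and field names are assumptions for illustration only.

```python
# Minimal sketch of multi-factor score fusion, assuming separate (unspecified)
# voice and face verification engines that each return a match score and a
# liveness/anti-spoofing score in [0, 1]. Thresholds are illustrative.
from dataclasses import dataclass


@dataclass
class BiometricResult:
    match_score: float      # similarity to the enrolled speaker or face
    liveness_score: float   # anti-spoofing confidence that the sample is live, not synthetic


def authenticate(voice: BiometricResult, face: BiometricResult,
                 match_threshold: float = 0.8,
                 liveness_threshold: float = 0.9) -> bool:
    # Require every factor to pass both its match and its anti-spoofing check;
    # a cloned voice or replayed video should fail the liveness test even if
    # its match score is high.
    for factor in (voice, face):
        if factor.match_score < match_threshold:
            return False
        if factor.liveness_score < liveness_threshold:
            return False
    return True


# Example: a convincing voice clone with a poor liveness score is rejected.
print(authenticate(BiometricResult(0.95, 0.30), BiometricResult(0.92, 0.97)))  # False
```

The design point is that the factors are combined conjunctively: an attacker would need to defeat the match and the anti-spoofing check on every modality at once, which is considerably harder than faking a single channel.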

Ultimately, proactive security measures that detect and stop deepfakes can limit the spread of disinformation before, during and after elections.