

Trust Stamp Launches Fast-Track Multi-Factor Biometric Authentication For Financial Institutions And Others Vulnerable To Deep Fake Voice Attacks

Author: Benzinga Newsdesk | April 08, 2024 10:01am

Trust Stamp (NASDAQ:IDAI), the Privacy-First Identity Company™, which provides AI-powered trust and identity services used globally across multiple sectors, is offering fast-tracked implementation to financial institutions and other enterprises that currently use voice recognition technologies or accept voice instructions via telephone calls.

Rapid advances in deepfake technology built on generative AI have made it possible for bad actors to accurately imitate an individual's voice patterns when giving instructions, whether to automated voice recognition systems or to financial institution staff accepting telephone instructions.

Andrew Gowasack, President of Trust Stamp, commented, "We have never offered voice-based authentication because it appeared probable that it would be spoofed by fast-advancing AI technology. Although OpenAI has stated that it is not currently releasing its Voice Engine for public use, there are many alternative generative AI engines available, including open-source models. Our multimodal authentication tool, which uses facial authentication with proof of life paired with optional device authentication, can quickly be integrated into current authentication systems as an alternative to, or supplement for, voice-based systems, and can also be initiated as a standalone service for high-risk transactions within two to three days of subscription."
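The multimodal approach described above layers several independent checks, so a spoofed voice alone is never sufficient. As a minimal sketch of how such factor-combination logic might look (this is a hypothetical illustration, not Trust Stamp's actual API; the names, threshold, and `require_device` flag are assumptions):

```python
from dataclasses import dataclass

@dataclass
class AuthSignals:
    """Hypothetical bundle of signals from a multimodal check."""
    face_match_score: float   # 0.0-1.0 similarity from facial comparison
    liveness_passed: bool     # proof-of-life (anti-replay/deepfake) check
    device_trusted: bool      # optional device-binding factor

def authenticate(signals: AuthSignals,
                 face_threshold: float = 0.9,
                 require_device: bool = False) -> bool:
    """Grant access only when the facial match and liveness check both
    pass; device trust is additionally enforced for high-risk flows."""
    if signals.face_match_score < face_threshold:
        return False
    if not signals.liveness_passed:
        return False
    if require_device and not signals.device_trusted:
        return False
    return True
```

Under this sketch, an ordinary login might set `require_device=False`, while a high-risk transaction would demand all three factors.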

Andrew Gowasack further commented, "Although there should be significant focus on attacks on the interaction between the customer and the financial institution, deepfake technology can also be used for attacks within the customer enterprise, resulting in the financial institution receiving instructions that have every appearance of being legitimate but were initiated by a fraudulent communication within the enterprise. Fraud of this type has typically been committed by email via a spear-phishing attack, but with voice and video deepfakes it can now be carried out through instructions given over Zoom or other video technologies, and the FBI has reported an increase in incidents using virtual meeting platforms dating back to 2019. This is often referred to as 'CEO Fraud,' and in February of this year we saw a widely publicized example in which a finance worker in Hong Kong paid out $25,000,000 based on a video call that included a deepfake representation of his company's CFO. The same technology that we are offering to financial institutions can also be used within enterprises to close the authentication loop before instructions are given to the financial institution."
