Page 136 - Cyber Defense eMagazine January 2024

Enterprises Will Start to Seriously Rethink Their MFA

If advanced threat actor groups like LUCR-3 taught us anything in their attacks on cloud environments, it's that MFA doesn't provide the security guarantees we'd like to think it does. Through SIM swapping, phishing, and push fatigue, advanced threat actors have found ways around MFA over the last few years, especially at victim organizations that allow SMS as a second factor. We're likely to see more companies move away from SMS-based authentication and accelerate the shift toward solutions that rely on biometrics or hardware keys, as MFA bypass techniques will continue to evolve. Facial biometric technology and hardware keys such as YubiKeys, for example, offer better security guarantees and are significantly more difficult to bypass. So how will threat actors adapt?
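The weakness of code-based factors is easy to see in how they work. A minimal sketch of RFC 6238 TOTP (the standard behind most authenticator-app codes) shows that the "second factor" is just a short number derived from a shared secret and the current time; anything the user can read off a screen can also be typed into a phishing page and relayed by an attacker's proxy in real time. Hardware keys resist this because, under FIDO2/WebAuthn, the key signs a challenge cryptographically bound to the site's origin, which a look-alike phishing domain cannot reproduce. The function name `totp` here is illustrative; the algorithm itself follows the RFC.

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32, t=None, step=30, digits=6):
    """RFC 6238 TOTP: HMAC-SHA1 over the current time-step counter."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if t is None else t) // step)
    # HMAC the big-endian 8-byte counter, then apply dynamic truncation (RFC 4226).
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)


# RFC 6238 test-vector secret ("12345678901234567890" in base32), at T=59 seconds.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", t=59))  # → 287082
```

Note that nothing in the output binds the code to a particular website or session: whoever holds the six digits within the 30-second window can use them anywhere, which is exactly what phishing proxies and push-fatigue campaigns exploit.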



            Threat Actor Groups Will Continue to Leverage AI for Evil

With the increase in the adoption of biometric security for MFA, there will be growth in the availability of toolkits for creating deepfakes, both to defeat voice- or video-based verification and to socially engineer personnel involved in credential-reset workflows. These toolkits, like many others today, will be readily available for purchase in underground markets. Deepfake assets will be critical in helping threat actors orchestrate sophisticated impersonation in social engineering attacks as part of their larger campaigns. Groups continue to grow bolder and more sophisticated in their social engineering, and, driven by their success in exploiting the human factor in enterprises, they will keep doing so. Many commercial platforms have gone to great lengths to prohibit the abuse of LLMs; this, however, will create high demand among threat actors for nefarious ChatGPT-equivalent services without such safeguards. With the release of powerful open-source models and the acceleration of public-domain research, the barrier to creating, training, and maintaining bad-actor LLMs has never been lower.

In short, the attack patterns we saw in 2023 are likely to continue into 2024. Modern cloud threat actors are moving away from activities like cryptomining, which have proven less profitable over the last year or so, and are gravitating toward more lucrative endeavors such as ransomware and extortion. Because MFA bypass has been a critical step in gaining access to an environment, expect threat actors' TTPs to keep pace with more secure MFA measures such as hardware keys and biometrics. Many advanced threat actor groups appear to be developing a real understanding of the cloud and of the resources they can leverage for their own gain. They will continue to orchestrate more elaborate campaigns against SaaS and cloud service providers, which yield larger gains than typical attacks against a single victim or tenant. As always, it's the responsibility of security teams to account for how threat actors' TTPs are evolving and to construct policies and plans that better address those threats.













Copyright © 2024, Cyber Defense Magazine. All rights reserved worldwide.