I’m backing up my Samsung Messages before it’s too late – 2 free and easy methods


Samsung Galaxy S23 Ultra with the S Pen beside it.

Kerry Wan/ZDNET



ZDNET’s key takeaways

  • Samsung is ending its messaging app sometime in July.
  • Users on Android 12 or newer must switch to Google Messages.
  • Back up your messages to Samsung Cloud or Google Drive.

The reckoning is here. Well, almost here, and maybe a little less dramatic than I’m describing. Samsung recently confirmed that it’ll be shuttering its own Messages app in favor of Google’s, requiring users on Samsung phones running Android 12 or newer to make the swap in July if they haven’t done so already.

This sunsetting marks the end of a nearly 16-year run for the proprietary communication platform, one that built quite the cult following over time. But no one should be surprised. Just two years ago, Samsung began shipping its latest Galaxy phones with Google Messages set as the default messaging app. More recently, Samsung Messages can't even be downloaded on phones like the Galaxy S26 series.

Also: Samsung is ending Messages in July: 5 replacements I’d switch to now

So, what gives? There are plenty of unofficial theories behind the transition, but the prevailing assumption is simple: Samsung no longer wants the burden of managing its own messaging servers. It makes more sense to hand the reins to Google, which has steadily built a platform home to billions of Android users. Between the practical, security, and financial ramifications, the move just makes sense for the business.

That said, if you’re like me and want to back up any important messages from over the years, you’ve got options. I’ll go over the most reliable ones below, forgoing the ones that involve third-party services that may put your personal information at risk.

1. Backing up your messages locally

Samsung portable SSD T9

Kyle Kucharski/ZDNET

Your most secure backup option is a local transfer to an external storage drive, such as a portable SSD. Depending on how many months' or years' worth of messages you'll be moving, make sure the drive has enough free space to hold the data.

This is often a good time to look over your various threads and conversations and delete the ones you don't want (or have) to store, such as 2FA and OTP codes, transaction histories, and receipts.

To back up your Samsung Messages texts, connect your SSD to your phone and open the Smart Switch app (Settings > Accounts and backup > Smart Switch). Tap the storage icon in the top-right corner, select Back up, then tap Messages. The phone should begin packaging your texts into a readable file.
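If you want to automate the earlier advice about pruning throwaway threads (2FA codes, receipts) before archiving, a short script can do the filtering against an exported file. The sketch below assumes a simple, hypothetical XML layout; Smart Switch's actual backup format is proprietary and differs, so treat this as an illustration of the filtering idea rather than a tool for Smart Switch output.

```python
import re
import xml.etree.ElementTree as ET

# Hypothetical export layout: <smses><sms address="..." body="..."/></smses>
# (Smart Switch's real backup format is different; this only illustrates the idea.)
SAMPLE = """<smses>
  <sms address="32665" date="1717000000000" body="Your verification code is 483920"/>
  <sms address="Mom" date="1717000100000" body="Dinner at 7?"/>
  <sms address="BANK" date="1717000200000" body="One-time passcode: 112233. Do not share."/>
</smses>"""

# Patterns that usually mark disposable messages: 2FA/OTP codes and receipts.
DISPOSABLE = re.compile(r"verification code|one-time passcode|OTP|receipt", re.IGNORECASE)

def keep_worth_saving(xml_text: str) -> list[str]:
    """Return the bodies of messages worth archiving, dropping disposable ones."""
    root = ET.fromstring(xml_text)
    return [sms.get("body") for sms in root.iter("sms")
            if not DISPOSABLE.search(sms.get("body", ""))]

print(keep_worth_saving(SAMPLE))  # → ['Dinner at 7?']
```

Keyword matching this simple will miss oddly worded codes, so skim what the script keeps (and drops) before deleting anything for real.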

2. Backing up your messages to the cloud

It can be intimidating for some, but cloud backups are much more reliable and handy than they used to be, thanks to security enhancements to services like Samsung Knox, which comes built in on most modern Samsung phones. It’s also an intuitive way to back up and restore files — not just text messages — when you eventually switch to a new device.

Also: How I saved myself $1,200 a year in cloud storage – in 5 sobering steps

There are two native cloud services that you can back up your Samsung Messages to: Samsung Cloud and Google Drive. Personally, I’d recommend the latter, as it’s more accessible across phones and PCs, including non-Samsung devices. Google Drive also comes with 15GB of free storage, versus Samsung’s 5GB, in case you haven’t been taking advantage of that benefit.

Here’s the breakdown of the two services to help you better decide. If it helps, there’s an argument for backing up to both services.

| Feature | Samsung Cloud | Google One/Drive |
| --- | --- | --- |
| Ideal for | Galaxy-to-Galaxy transitions | Multi-device and cross-brand sync |
| Free storage amount | 5GB (varies by region/plan) | 15GB (shared with Gmail/Drive) |
| Photos/videos | Synced via Microsoft OneDrive | Managed via Google Photos |
| Device settings | Deep backup of Home screen, alarms, etc. | Basic Android settings backup |
| App support | Samsung apps (Notes, Calendar) | Google apps and third-party app data |
| Accessibility | Best on Samsung devices | Accessible on web, iOS, and Android |

For both services, you can back up your texts by opening Settings > Accounts and backup > Back up data under Samsung Cloud and/or Google Drive. You’ll be required to sign in to either your Samsung account or Google account, depending on the service.

If the process is done properly, the respective settings pages should show your last backed-up timestamp, which should be today’s date.

What happens next?

With your text messages backed up, I’d recommend you start testing Google Messages, or alternative communication platforms, to see which one you’d prefer before Samsung ultimately pulls the plug in July. While the company hasn’t confirmed the exact date of the sunsetting, it never hurts to prepare yourself ahead of time.

Also: How to use Google Messages’ new Trash feature to recover texts you accidentally deleted

Personally, I’m switching to Google Messages. It may not have the same personalization features as Samsung’s, but it has its advantages, like a more universal RCS system that isn’t limited to any specific carrier, some useful Gemini integrations, smart replies and native AI image generation, and, from what I’ve experienced, a more reliable spam detector. We may even see some newer features as soon as later this month.







  • Law establishes national prohibition against nonconsensual online publication of intimate images of individuals, both authentic and computer-generated.
  • First federal law regulating AI-generated content.
  • Creates requirement that covered platforms promptly remove depictions upon receiving notice of their existence and a valid takedown request.
  • For many online service providers, complying with the Take It Down Act’s notice-and-takedown requirement may warrant revising their existing DMCA takedown notice provisions and processes.
  • Another carve-out to CDA immunity? More like a dichotomy of sorts…

On May 19, 2025, President Trump signed the bipartisan-supported Take it Down Act into law. The law prohibits any person from using an “interactive computer service” to publish, or threaten to publish, nonconsensual intimate imagery (NCII), including AI-generated NCII (colloquially known as revenge pornography or deepfake revenge pornography). Additionally, the law requires that, within one year of enactment, social media companies and other covered platforms implement a notice-and-takedown mechanism that allows victims to report NCII.  Platforms must then remove properly reported imagery (and any known identical copies) within 48 hours of receiving a compliant request.

Support for the Act and Concerns

The Take it Down Act attempts to fill a void in the policymaking space, as many states had not enacted legislation regulating sexual deepfakes when it was signed into law. The Act has been described as the first major federal law that addresses harm caused by AI. It passed the Senate in February of this year by unanimous consent and passed the House of Representatives in April by a vote of 409-2. It also drew the support of many leading technology companies.

Despite receiving almost unanimous support in Congress, some digital privacy advocates have expressed concerns that the new notice-and-takedown mechanism could have unintended consequences for digital privacy in general. For example, some commentators have suggested that the statute's takedown provision is written too broadly and lacks sufficient safeguards against frivolous requests, potentially leading to the removal of lawful content – especially given the short 48-hour window to act following a takedown request. [Note: In 2023, we similarly wrote about abuses of the takedown provision of the Digital Millennium Copyright Act.] In addition, some have argued that the law could undermine end-to-end encryption by forcing providers to "break" encryption to comply with the removal process. Supporters of the law have countered that private encrypted messages would likely not be considered "published" under the text of the statute (which uses the term "publish" as opposed to "distribute").

Criminalization of NCII Publication for Individuals

The Act makes it unlawful for any person “to use an interactive computer service to knowingly publish an intimate visual depiction of an identifiable individual” under certain circumstances.[1] It also prohibits threats involving the publishing of NCII and establishes various criminal penalties. Notably, the Act does not distinguish between authentic and AI-generated NCII in its penalties section if the content has been published. Furthermore, the Act expressly states that a victim’s prior consent to the creation of the original image or its disclosure to another individual does not constitute consent for its publication.

New Notice-and-Takedown Requirement for “Covered Platforms”

Along with punishing individuals who publish NCII, the Take it Down Act requires covered platforms to create a notice-and-takedown process for NCII within one year of the law’s passage. Below are the main points for platforms to consider:

  • Covered Platforms. The Act defines a “covered platform” as a “website, online service, online application, or mobile application” that serves the public and either provides a forum for user-generated content (including messages, videos, images, games, and audio files) or regularly deals with NCII as part of its business.
  • Notice-and-Takedown Process. Covered platforms must create a process through which victims of NCII (or someone authorized to act on their behalf) can send notice to them about the existence of such material (including a statement indicating a “good faith belief” that the intimate visual depiction of the individual is nonconsensual, along with information to assist in locating the unlawful image) and can request its removal.
  • Notice to Users. Adding an additional compliance item to the checklist, the Act requires covered platforms to provide a “clear and conspicuous” notice of the Act’s notice and removal process, such as through a conspicuous link to another web page or disclosure.
  • Removal of NCII. Within 48 hours of receiving a valid removal request, covered platforms must remove the NCII and “make reasonable efforts to identify and remove any known identical copies.”
  • Enforcement. Compliance under this provision will be enforced by the Federal Trade Commission (FTC).
  • Safe Harbor. Under the law, covered platforms will not be held liable for “good faith” removal of content that is claimed to be NCII “based on facts or circumstances from which the unlawful publishing of an intimate visual depiction is apparent,” even if it is later determined that the removed content was lawfully published.

Compliance Note: For many online service providers, complying with the Take It Down Act's notice-and-takedown requirement may warrant revising their existing DMCA takedown notice provisions and processes, especially if those processes have not been reviewed or updated for some time. Many "covered platforms" may rely on automated processes (or automated efforts combined with targeted human oversight) to fulfill Take It Down Act requests and meet the related obligation to make "reasonable efforts" to identify and remove known identical copies. This may involve using tools for processing notices, removing content, and detecting duplicates. As a result, some providers should consider whether their existing takedown provisions should be amended to address these new requirements, and how they will implement these new compliance items on the backend using the infrastructure already in place for the DMCA.
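To make the automated-compliance point concrete, here is a minimal sketch of the two mechanical pieces a platform might build: a removal deadline computed from the Act's 48-hour window, and exact-duplicate detection by content hash. Only the 48-hour window and the "known identical copies" language come from the statute; the storage layout and helper names are hypothetical.

```python
import hashlib
from datetime import datetime, timedelta, timezone

def fingerprint(content: bytes) -> str:
    """A byte-identical copy produces an identical SHA-256 digest."""
    return hashlib.sha256(content).hexdigest()

def removal_deadline(received_at: datetime) -> datetime:
    """Covered platforms must act within 48 hours of a valid request."""
    return received_at + timedelta(hours=48)

# Hypothetical store of hosted files: file ID -> raw bytes.
hosted = {
    "img-001": b"\x89PNG...original",
    "img-002": b"\x89PNG...unrelated",
    "img-003": b"\x89PNG...original",   # byte-identical re-upload of img-001
}

def find_identical_copies(reported_id: str) -> list[str]:
    """'Reasonable efforts' step: flag every exact duplicate of the reported file."""
    target = fingerprint(hosted[reported_id])
    return [fid for fid, data in hosted.items()
            if fid != reported_id and fingerprint(data) == target]

print(find_identical_copies("img-001"))  # → ['img-003']
print(removal_deadline(datetime(2026, 6, 1, 9, 0, tzinfo=timezone.utc)))
```

Note the design trade-off: a cryptographic hash only catches byte-identical re-uploads, so a copy that has been re-encoded or resized will slip past it. That is why production systems typically layer in perceptual hashing (e.g., the open-source PDQ algorithm) for near-duplicate matching.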

What about CDA Section 230?

Section 230 of the Communications Decency Act ("CDA"), 47 U.S.C. § 230, prohibits a "provider or user of an interactive computer service" from being held responsible "as the publisher or speaker of any information provided by another information content provider." Courts have construed the immunity provisions in Section 230 broadly in a variety of cases arising from the publication of user-generated content.

Following enactment of the Take It Down Act, some important questions for platforms are: (1) whether Section 230 still protects platforms from actions related to the hosting or removal of NCII; and (2) whether FTC enforcement of the Take It Down Act’s platform notice-and-takedown process is blocked or limited by CDA immunity. 

At first blush, it might seem that the CDA would restrict enforcement against online providers in this area, as decisions regarding the hosting and removal of third-party content would necessarily treat a covered platform as a "publisher or speaker" of third-party content. However, a closer examination of the text of the CDA suggests the answer is more nuanced.

It should be noted that the Good Samaritan provision of the CDA (47 U.S.C. § 230(c)(2)) could be used by online providers as a shield from liability for actions taken to proactively filter or remove third-party NCII content, or to remove NCII at the direction of a user's notice under the Take It Down Act, as CDA immunity extends to good faith actions to restrict access to or availability of material that the provider or user considers to be "obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable." Moreover, the Take It Down Act adds its own safe harbor for online providers for "good faith disabling of access to, or removal of, material claimed to be a nonconsensual intimate visual depiction based on facts or circumstances from which the unlawful publishing of an intimate visual depiction is apparent, regardless of whether the intimate visual depiction is ultimately determined to be unlawful or not."

Still, further questions about the reach of the CDA prove more intriguing. The Take It Down Act appears to create a dichotomy of sorts regarding CDA immunity in the context of NCII removal claims. Under the text of the CDA, it appears that immunity would not limit FTC enforcement of the Take It Down Act's notice-and-takedown provision affecting "covered platforms." To explore this issue, it's important to examine the CDA's exceptions, specifically 47 U.S.C. § 230(e)(1).

Effect on other laws

(1) No effect on criminal law

Nothing in this section shall be construed to impair the enforcement of section 223 or 231 of this title [i.e., the Communications Act], chapter 71 (relating to obscenity) or 110 (relating to sexual exploitation of children) of title 18, or any other Federal criminal statute.

Under the text of the CDA's exception, Congress carved out Sections 223 and 231 of the Communications Act from the CDA's scope of immunity. Since the Take It Down Act states that it will be codified at Section 223 of the Communications Act of 1934 (i.e., 47 U.S.C. § 223(h)), it appears that platforms would not enjoy CDA protection from FTC civil enforcement actions based on the agency's authority to enforce the Act's requirements that covered platforms "reasonably comply" with the new notice-and-takedown obligations.

However, that is not the end of the analysis for platforms. Interestingly, it would appear that platforms would generally still retain CDA protection (subject to any exceptions) from claims related to the hosting or publishing of third-party NCII that has not been the subject of a Take It Down Act notice, since the Act's removal requirements would not be implicated without a valid removal request.[2] Similarly, a platform could make a strong argument that it retains CDA immunity from any claims brought by an individual (rather than the FTC) for failing to reasonably comply with a Take It Down Act notice. That said, it is conceivable that litigants – or even state attorneys general – might attempt to frame such legal actions under consumer protection statutes, as the Take It Down Act states that a failure to reasonably comply with an NCII takedown request is an unfair or deceptive trade practice under the FTC Act. Even in such a case, platforms would likely contend that such claims by these non-FTC parties are merely claims based on a platform's role as publisher of third-party content and are therefore barred by the CDA.

Ultimately, most, if not all, platforms will likely make best efforts to reasonably comply with the Take It Down Act, thus avoiding the above contingencies. Yet, for platforms using automated systems to process takedown requests, unintended errors may occur, and it's important to understand how and when the CDA would still protect them against any related claims.

Looking Ahead

It will be up to a year before the notice-and-takedown requirements become effective, so we will have to wait and see how well the process works in eradicating revenge pornography material and intimate AI deepfakes from platforms, how the Act potentially affects messaging platforms, how aggressively the Department of Justice will prosecute offenders, and how closely the FTC will be monitoring online platforms’ compliance with the new takedown requirements.

It also remains to be seen whether Congress has an appetite to pass more AI legislation. Less than two weeks before the Take it Down Act was signed into law, the Senate Committee on Commerce, Science, and Transportation held a hearing on “Winning the AI Race” that featured the CEOs of many well-known AI companies. During the hearing, there was bipartisan agreement on the importance of sustaining America’s leadership in AI, expanding the AI supply chain and not burdening AI developers with a regulatory framework as strict as the EU AI Act. The senators listened to testimony from tech executives calling for enhanced educational initiatives and the improvement of infrastructure needed for advancing AI innovation, alongside discussing proposed bills regulating the industry, but it was not clear whether any of these potential policy solutions would receive enough support to be signed into law.

The authors would like to thank Aniket C. Mukherji, a Proskauer legal assistant, for his contributions to this post.


[1] The Act provides that the publication of the NCII of an adult is unlawful if (for authentic content) “the intimate visual depiction was obtained or created under circumstances in which the person knew or reasonably should have known the identifiable individual had a reasonable expectation of privacy,” if (for AI-generated content) “the digital forgery was published without the consent of the identifiable individual,” and if (for both authentic and AI-generated content) what is depicted “was not voluntarily exposed by the identifiable individual in a public or commercial setting,” “is not a matter of public concern,” and is intended to cause harm or does cause harm to the identifiable individual. The publication of NCII (whether authentic or AI-generated) of a minor is unlawful if it is published with intent to “abuse, humiliate, harass, or degrade the minor” or “arouse or gratify the sexual desire of any person.” The Act also lists some basic exceptions, such as publications of covered imagery for law enforcement investigations, legal proceedings, or educational purposes, among other things.

[2] Under the Act, “Upon receiving a valid removal request from an identifiable individual (or an authorized person acting on behalf of such individual) using the process described in paragraph (1)(A)(ii), a covered platform shall, as soon as possible, but not later than 48 hours after receiving such request—

(A) remove the intimate visual depiction; and

(B) make reasonable efforts to identify and remove any known identical copies of such depiction.


