client guides · 31 Jul 2022

Why Google and Apple May Remove Your App and How to Deal With It [+ Free Pre-Publish Checklist]

Anastasiya Kharychkova

Head of Business Development

Illustration by Amir Kerr

The mobile app industry is growing steadily without any signs of slowing down. Experts predict that mobile apps will generate $613 billion by the end of 2025. Further, according to Statista, there are already 2.11 million apps available for download on the Apple App Store and 3.29 million on Google Play. 

As you are reading, hundreds of thousands of new applications are awaiting publication. And based on these stats, it can be tempting to enter the dynamic and profitable industry of mobile apps.

Let’s say you decided to launch a new mobile app. You raised the money, assembled the dream team, developed the product, and are ready to run a marketing campaign. At this point, you may think that the most crucial and complicated phases are behind you, but none of this hard work will matter if Apple or Google denies or removes your application.

Passing the review process and getting your app published can be challenging. Over the years, Apple and Google have created policy documents on how apps must be developed, updated, and promoted. These requirements are meant to support the high-quality standards of both stores.

There are several pitfalls that could determine your app’s destiny. If you encounter any of these, it won’t matter how many years your app has been available, how popular your company is, or how carefully you have followed the guidelines and the policies.

Next, let’s discuss the most common reasons that apps are removed from app stores so you can avoid these situations.

Malware


Apple and Google are constantly perfecting their algorithms to find applications with malicious content. You may think this wouldn’t apply to your app because you aren’t implementing illegal content, but that may not be the case. 

Unfortunately, app creators may break the rules unknowingly. This often happens when an app includes third-party code from outside sources that introduces disruptive ads or illegal data-collection malware. An example of this was seen in October 2019, when security companies ESET and Wandera found over 50 dangerous apps that Apple and Google immediately removed from their stores. 

By then, Android applications carrying a new strain of adware called “Ashas” had been downloaded more than eight million times. Ashas displayed full-screen ads and collected information about the user’s device type, OS version, language, number of installed apps, free storage space, battery status, and more. 

For iOS apps, this malware opened links and webpages in the background. Further, it generated additional clicks to increase the owner’s revenue through trojan-clicker malware. 

Another example of this type of removal came in 2021, when a popular messaging-focused app named Color Message (500k Google Play downloads) was found to contain the latest Joker malware. The malware simulated clicks to generate revenue from malicious ads and connected to Russian servers. Color Message would also access users' contact lists, infiltrate networks, and subscribe users to paid services.

New malware is still being found today. For example, this year, Google pulled six fake antivirus apps from the Google Play Store for exposing users to “Sharkbot” malware that stole login credentials and bank account information. Thus, you’ll need to ensure your app isn’t exposed to any malware if you want to keep it available on app stores.

Disruptive Adware

Disruptive adware is advertising that blocks or interferes with the core functionality of an app, or even of the device itself. For example, this may look like an app that shows ads while a user tries to make a phone call or unlock their device. 

In February 2020, Google announced that nearly 600 applications had been removed from Google Play for disruptive ads. Then, 30 more applications with 20 million users were deleted on June 19, 2020. Some of these applications were among the most popular on the Google Play Store, including Lite Beauty Camera (1 million installs), Beauty Collage Lite (500k installs), and Sunny Beauty Camera (1 million installs). 

However, these removals have not ended all disruptive adware. Google has since shared that it is developing a new machine-learning algorithm to detect ads shown out of context, and this algorithm will review all new apps, including yours.

Clone Apps and Plagiarism 

Not all of us can generate genius ideas from scratch. There is a saying: “Everything new is well-forgotten old.” Unfortunately, some take this saying too literally, leading them to plagiarize. 

Remember, to run a successful business, you must add new valuable features to stand out among competitors. Being unique is crucial for profitability but also for staying on the App Store and Google Play, as both stores are attentive to apps’ value and uniqueness. 

For instance, search Google Play for a famous brand like "Mario"; you'd be surprised at how many of the Mario apps aren't from Nintendo. Many of these apps likely infringe on Nintendo’s trademark, copyright, or other intellectual property rights and will be removed as a result.


In general, these types of infringements could be imagery or music in the app, gameplay mechanics, or character and brand names. Fortunately, it's likely that many of these apps are harmless to the user and just designed to capitalize on brand recognition, but this doesn’t keep them from being removed from stores. 

One of the most high-profile plagiarism cases involves the Chinese app Zynn, a clone of another Chinese video-sharing app, TikTok.

The only difference between them is Zynn’s engagement strategy: Zynn paid users varying sums of money for bringing their friends to the app. This strategy seems to have worked out; within the first week of release, Zynn reached the top ten most downloaded apps. 

Suddenly, at the beginning of June 2020, the app was removed from Google Play and the App Store. The decision was made based on complaints from numerous TikTok users, as their TikTok accounts, including their personal data and media content, had been copied to Zynn without permission.

Currently, the investigation is ongoing. It is expected to be a long process, given the market leadership struggle between TikTok and Zynn.

Another example of this violation occurred in January 2022, when Apple removed several Wordle clones from its App Store after users confused them with the original word game popularized by Jimmy Fallon and other celebrities.

Access to User Data 

When it comes to user data security, it’s vital to note that Apple is much stricter than Google, so you’ll need to think twice before asking for access to any kind of user data. If you have to ask, think carefully about your reason for asking and explain it to your users clearly. However, be aware that Apple or Google may decide your app functionality doesn’t need access to the requested data. 
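On iOS, for instance, this explanation is given through purpose strings in the app's Info.plist, which the system shows in the permission prompt. A minimal sketch; the keys are real iOS keys, but the usage descriptions are hypothetical examples, not Apple-mandated wording:

```xml
<!-- Info.plist: purpose strings shown in the system permission prompt. -->
<!-- The wording below is a hypothetical example; describe the concrete
     feature that needs the data, not a generic justification. -->
<key>NSCameraUsageDescription</key>
<string>Used only to take profile photos you choose to upload.</string>
<key>NSContactsUsageDescription</key>
<string>Used only to suggest friends you pick to invite.</string>
```

Reviewers are known to flag vague purpose strings (e.g. "needed for the app to work"), so tying each string to a visible feature is the safer route.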

In June 2020, Apple announced a new privacy information section for product pages on the App Store. This was the beginning of a new program to give customers more transparency about what data apps may gather about them. The program presents this information, self-reported by developers, in an easy-to-understand format across all apps. 

Google Play supported this idea, and in May 2021, Google announced a new "Data safety" section. Developers are now required to disclose how their apps collect and share user data and what security practices they follow.

Google also updated Google Play Protect in 2021. As announced in an email from Google, Play Protect now automatically revokes permissions from Android apps you haven’t used in a while. Previously, you had to visit each app’s settings manually to toggle off the camera, microphone, storage, contacts, and phone access that the developer could use on your device.
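On Android, the equivalent first step is declaring only the permissions a shipped feature actually uses in AndroidManifest.xml, since every declared permission is something reviewers and users can question. A hedged sketch; the package name and feature comments are illustrative:

```xml
<!-- AndroidManifest.xml: declare only permissions tied to real features. -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.app"> <!-- hypothetical package name -->

    <!-- Needed for the in-app photo capture feature. -->
    <uses-permission android:name="android.permission.CAMERA" />

    <!-- Deliberately NOT requesting READ_CONTACTS, READ_SMS, etc.:
         unused permissions invite scrutiny during data-safety reviews
         and may be auto-revoked on idle devices anyway. -->
</manifest>
```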

App and Content Value

App and content value removals are mostly based on subjective reasoning. Seemingly, these removals may occur more often when the app is health-related. 

For instance, in 2019, BBC announced that Apple removed 181 vaping apps. Some of the removed apps allowed users to control their e-cigarettes’ features, while others helped users stay up to date with news about vaping or offered themed games.

Apple executives decided to remove these apps after the release of the latest US research on the negative health impact of vaping. In this research, the CDC stated that 42 deaths and more than 2,100 lung injury cases had been linked to a respiratory illness tied to vaping.

Another example came after the start of the COVID-19 pandemic. Both Apple and Google cracked down on coronavirus-related applications that were not published by health organizations or governments to prevent the spread of misinformation and exploitation for financial gain.

Abandoned Apps 

Both platforms have a significant number of abandoned apps, which typically refers to apps that have not been updated in two years. In Google's case, this figure amounts to 869,000 apps, while Apple has around 650,000. 

Google and Apple have both taken measures to deal with abandoned apps. Specifically, Google is preparing to hide those apps, making it impossible for users to download them until developers update them.

In April 2022, Apple made changes to its App Store Improvements process. As a result, apps that had not been updated within the last three years and failed to meet a minimal download threshold were flagged for possible removal from the App Store. Developers of the flagged apps were given 90 days to update them. 

The main reason both companies are taking these measures is to protect their users’ security. Older apps don't take advantage of changes in Android and iOS, new APIs, or new development methods. As a result, older apps can have security flaws that newer apps don’t, meaning it is critical to keep your apps updated.

Restricted Content

Both Apple and Google impose restrictions on the content an app provides. The list below shares the most frequent reasons that apps are removed due to their content. To keep your apps in stores, make sure you avoid content that includes:

  • sexually explicit or pornographic content;
  • defamatory, discriminatory, or mean-spirited content;
  • content that potentially endangers children;
  • graphic depictions or descriptions of violence or violent threats to any person or animal;
  • bullying and harassment;
  • realistic portrayals of people or animals being killed, maimed, tortured, or abused;
  • instructions on how to engage in violent activities like bomb- or weapon-making, or facilitation of the purchase of firearms or ammunition;
  • self-harm, suicide, eating disorders, choking games, or other acts that may result in injury or death;
  • hate speech;
  • references to sensitive events like a disaster, atrocity, conflict, or death;
  • gambling;
  • illegal activities;
  • inflammatory religious commentary or inaccurate or misleading quotations of religious texts;
  • inappropriate user-generated content.

Even if you are sure there is nothing to worry about when it comes to your app, Google and Apple may have another point of view. Over the last few years, there has been an increasing number of complaints from Android developers claiming that Google has removed apps and terminated developer accounts for no apparent reason. 

Numerous articles on Medium, AndroidPub, and other popular platforms express outrage over the same issue. Startups and long-running applications with hundreds of thousands of downloads have suffered unpredictable removals due to unclear content guidelines, so it is best to strictly follow the content list provided above.

Orangesoft’s Case

After having published many different types of apps over the years, our company eventually found itself in the same situation.

For a few months, we had been working with our client on a dating app for both Android and iOS. Its core advantage was meant to be a unique matching algorithm powered by artificial intelligence. The in-app functionality was designed to be as simple as possible: a built-in messenger and personal data management.

Originally, we decided to give people the opportunity to find matches for different purposes: love, business, sex, sex&friendship, and friendship. 

Keeping in mind the specifics of the application and the strict rules of Apple and Google, we carefully examined the Developer Program policies, especially the sections about explicit content. 

First, we submitted the app to Google Play at the end of January 2020. It took a few days, but the app passed the review and was waiting to be published. We planned to submit the iOS version the following week and then make both public at the same time. 

But a week after the review, we got an email informing us that the app had been suspended and removed from the Google Play store. 

Notification from Google Play about the app removal

Usually, Google sends a warning email giving developers the chance to make changes and pass the review again. Not in this case. Moreover, a few hours later, we found out that our client’s developer account had been terminated without any warning. 

We guessed that the reviewer got the wrong idea about the app because of chat scenario titles like “sex” and “sex&friendship.” We reviewed Google’s statement about inappropriate content once again and didn’t find any mention of a ban on words with a sexual context. What’s more, there are applications in the Google Play Store that use not only the word "sex" but also terms like "swingers," "threesome," and "BDSM."

☝️ On July 8th, 2020, Google announced an update to the Developer Program policies. Among other changes, they added one more paragraph to the Sexual Content and Profanity section:

"Content that is lewd or profane - including but not limited to content which may contain profanity, slurs, explicit text, adult/sexual keywords in the store listing or in-app."

Now that explains why the app could have been deleted from the Play Store.

Within the following week, there was a string of emails and appeals, all met with automated and useless answers from Google. Furthermore, we were puzzled as to why Google decided to terminate the developer’s account. These situations follow a variety of scenarios: some accounts are terminated after two suspensions or removals, others just after their first strike. 

The main issue in such situations is that all emails and appeals are received and answered by Google bots. This seemingly small obstacle renders all your efforts pointless. In this case, you have only two ways to solve the problem:

  • Write articles and posts on social media and platforms like Reddit or Medium, hoping that someone from Google will come across your writing and give your case a proper review.
  • Do your best to meet, call, or email actual Google employees, especially those in Business Development and Business Relations positions. You can try searching for them on LinkedIn, Facebook, or among your personal network.

We chose the second option, and soon our client, the owner of the dating app, managed to make a detailed appeal to the right person. A few days later, the client's developer account was unblocked. 

We wound up removing the term “sex” and retitling the chat scenarios “love,” “business,” “dating,” and “friendship.” We then submitted an updated .apk file for review. Fortunately, this time it went smoothly. 

Now the app is available for download on both the App Store and the Google Play Store. 

Related: List of Alternative App Stores for iOS and Android

Government Request

If your business requires promotion outside your home country, be sure you’ve carefully studied each target country’s laws and its economic, social, cultural, and political landscape, because many governments send requests to remove apps from the stores over legal violations. China is undoubtedly the leader on this list, and Apple is often accused of being too cooperative.

Worldwide government App Store takedown requests. Source: Apple Transparency Report: Government and Private Party Requests Worldwide

For an example of this removal type, let’s look at China’s app removal requests. The timeline begins in 2017, when China deemed VPNs illegal unless they obtained permission to operate on the country’s territory. Soon after, Apple removed more than 600 VPN apps from China’s App Store.

According to the latest Apple transparency report, 123 applications were removed during the first half of 2021. Apple provided two official reasons for these removals: platform violations and legal violations, which likely mean illegal gambling and pornographic content, respectively, as both are illegal in China.

Next, at the end of February 2020, the Washington Post reported that Plague Inc., a virus-spreading simulator with 130 million players, was no longer available on the Chinese App Store. It, too, was deleted at the request of the Chinese government. 

Finally, in 2016, China passed a law requiring licenses for all paid mobile games. Google Play enforced the new rule immediately, but Apple held off until February 2020, when it finally announced that all developers of mobile games with in-app purchases had to submit their licenses by June 30, 2020.

According to official government data, as of June 2020, China had issued only 43,104 game licenses since 2010, yet around 60,000 mobile games on the Chinese App Store required payment or had in-app purchases. Zhu Qinan, an analyst at Zhongtai Securities, said: “Our sampling shows that among the 200 top-grossing games in China’s app store, about ten percent are unlicensed. The top 200 games contribute to over 80 percent of the App Store game revenue.”

“It takes four to eight months to get a game license, and [the government] only issued a total of 1,572 game licenses in 2019,” said Rich Bishop, AppInChina CEO. “This means that the vast majority of the 21,563 paid games or games with in-app purchases on the Apple App Store [in] China won’t be able to get a game license for several years, let alone by June 30, 2020.”

For reasons like this app removal timeline in China, you’ll want to ensure you know the laws in the countries where you’ll distribute your app. This way, you can prepare your app properly and avoid removals by government request.

Pre-publish Checklist

Before you publish your application, we recommend inspecting it against the pre-publish checklist our consultants have prepared:


After an exhausting and exciting process of creating the product, you are now one step away from your customers. Before you launch, remember these quick tips: 

Alternatively, you can leave these steps to experts from a development partner company, as they know firsthand all the paths that lead to a successful app launch. Get in touch to find out how we can help you with this process.

You are now prepared. We hope that your application will soon be available for download on the App Store and Google Play.

Why was my app removed from Google Play?

There are several common reasons why an app can be removed from Google Play:

  • Malware (apps with malicious content)
  • Disruptive adware (advertising that blocks or interferes with an app's core functionality)
  • Clone apps and plagiarism (a copy of an already existing app)
  • App and content value (for example, health-related apps that may cause harm)
  • Restricted content (the most frequent types of inappropriate content)
  • Government requests (a conflict between a country's laws and the app's content)

Why do apps get taken off the App Store?

Apple constantly updates its algorithms to catch apps that don't follow its policies.
The top reasons your app can be taken off the App Store:

  • Malware (apps with malicious content)
  • Disruptive adware (software with harmful advertising or illegal data collection)
  • Clone apps and plagiarism (an app with no original value or uniqueness)
  • Access to data (Apple pays tremendous attention to user data security)
  • App and content value (apps that have a negative impact on people’s health)
  • Restricted content (for example, sexually explicit content, violence or violent threats, bullying and harassment, hate speech, and so on)
  • Government requests.

Do apps get deleted for a one-star rating?

Neither Apple nor Google removes an application solely because of low ratings. An app can be deleted only if it violates store policies.

Why are apps removed from the App Store?

Applications that violate any of the App Store policies and rules are removed or hidden, making it impossible for users to download them. Malware, plagiarism, and restricted content are among the most common reasons for app removal.

How many apps have been removed from the App Store?

In 2022, the App Store rejected nearly 1.7 million app submissions due to policy violations, while almost 187,000 apps were blocked or removed from the App Store. Apple also shut down over 428,000 developer accounts for potentially fraudulent activity.
