client guides · 12 min read · 06 Jul 2020

Why Google and Apple May Remove Your App, and How to Deal With It

Anastasiya Kharychkova

Project Manager

Illustration by Amir Kerr

The mobile app industry is growing at a steady rate, with no signs of slowing down in the near future. Analysts predict that mobile apps will generate $935 billion by the end of 2023. According to Statista, there are 1.84 million apps available for download on the Apple App Store and 2.56 million apps on the Google Play Store. As you are reading this, hundreds of thousands of new applications are awaiting publication. It can be tempting to enter such a dynamic and profitable industry.

Worldwide mobile app revenues 2014-2023. Source: Statista

Let’s say you decided to launch a new mobile app. You raised the money, built the dream team, developed the product, and are ready to run a marketing campaign. Now you think the most crucial and complicated phases are behind you. But none of your work will count if Apple or Google removes your application. 

Passing the review and getting published can be a challenge. Over the years, Apple and Google have created policy documents on how apps must be developed, updated, and promoted. These requirements are meant to uphold the high quality standards of both stores.

There are several pitfalls that could determine the destiny of your app. If you encounter these, it won’t matter how many years your app has been in stores, how popular and famous your company is, or how carefully you have followed the guidelines and the policies. 

Related: How to Publish an Android App on Google Play Store: A Step-by-Step Guide

Let’s discuss the most common reasons for app removal so you can do your best to avoid tricky situations.


Malware

Both Apple and Google are constantly refining their algorithms to find applications with malicious content. You may think this doesn’t apply to you because you aren’t going to implement anything illegal. Unfortunately, app creators often break the rules unknowingly. For example, implementing third-party code from different sources can bring in suspicious software with disruptive ads or illegal data collection.

In October 2019, security companies ESET and Wandera found more than 50 dangerous apps that Apple and Google immediately removed from the stores.

By that time, Android applications with a new strain of adware called Ashas had already been downloaded more than 8 million times. Ashas involved fullscreen ads and collected details about the device type, OS version, language, number of installed apps, free storage space, battery status, and so on.

iOS apps, in turn, opened links and webpages in the background and generated additional clicks to increase the owner’s revenue through trojan-clicker malware. 

Disruptive adware

Disruptive adware is advertising that blocks the core functionality of an app or even the device itself. For example, the app would show ads while a user is trying to make a phone call or unlock the device.

In February 2020, Google announced that nearly 600 applications had been removed from the Google Play Store for disruptive ads. Thirty more applications with a combined 20 million users were deleted on June 19th, 2020, alone. Some of these were among the most popular on the Play Store, such as Lite Beauty Camera (1 million installs), Beauty Collage Lite (500k installs), and Sunny Beauty Camera (1 million installs).

And even this is not the end of the war on disruptive adware: Google has shared that it is developing a new machine-learning algorithm to detect ads shown out of context.

Clone apps and plagiarism 

Not all of us can generate genius ideas from scratch. As the saying goes, everything new is well-forgotten old. Some take it literally. But to run a successful business, you usually need to take an existing idea and add new, valuable features to stand out among the competitors. That’s crucial not only for profitability but also for staying on the App Store and the Google Play Store. Both giants pay close attention to apps’ value (of which we’ll speak more later) and uniqueness.

One of the most high-profile plagiarism cases involves the Chinese app Zynn.

Zynn is a twin of TikTok, another Chinese video-sharing app. The main difference between them is Zynn’s engagement strategy: Zynn paid its users varying sums of money for bringing friends to the app. The strategy seems to have worked. Within the first week after its release, Zynn got into the top 10 most downloaded apps.

Then, at the beginning of June 2020, the app was removed from the Google Play Store and, a few days later, from the App Store. The decision was based on complaints from numerous TikTok users whose accounts, along with personal data and media content, had been copied to Zynn without permission.

The company’s spokesperson said: “This is an isolated incident that has triggered a routine investigation from Google’s platform. [...] This [plagiarism issue] came about as Zynn was gaining popularity in the US, and many users were uploading content and building their communities. Zynn has always been an ardent supporter of original content. Zynn respects and requires creators to abide by local copyright laws. Zynn provides users with a one-click complaint feature to protect creators’ rights.”

The investigation is ongoing, and Zynn is yet to prove its innocence. It could be a long process considering the ongoing battle for market leadership between Zynn and TikTok.  

Access to data 

This is a weak spot for many apps. It’s important to note that Apple is much stricter in the area of user data security than Google. Think twice before asking for access to any kind of data. If you have to ask, think carefully about the reason for asking and make sure to explain it to your users and Apple as clearly as possible. However, be prepared that Apple could decide your app functionality doesn’t really need access to the requested data for a variety of reasons. 
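On iOS, the most direct place to explain the reason for a data request is the purpose string attached to each permission in the app’s Info.plist: both App Review and the system permission prompt surface this text. A minimal sketch of such a fragment for a hypothetical app that needs location and camera access (the wording below is illustrative, not Apple-mandated):

```xml
<!-- Fragment of an iOS Info.plist: each sensitive permission carries
     a purpose string shown to users and checked during App Review. -->
<key>NSLocationWhenInUseUsageDescription</key>
<string>Your location is used only to show nearby matches and is never shared with third parties.</string>
<key>NSCameraUsageDescription</key>
<string>The camera is used only to take your profile photo.</string>
```

A vague or missing purpose string for a requested permission is itself grounds for rejection, so it pays to write these as carefully as any marketing copy.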

In May 2018, thousands of apps were accused of sharing location information with third parties without explicit permission from users. These apps were removed from the App Store for violating Apple’s location data policy. Apple’s actions are totally justified here. 

However, the same cannot be said about the policy on some screen-time and parental-control apps. Over the past few years, Apple hunted down numerous apps from this category. Some of them were forced to give up their core functionality, which allowed adults to control kids’ smartphones and their access to apps and content. Others were simply removed.  

It all happened shortly after Tim Cook announced Apple’s phone addiction control tools at the WWDC 2018. Businesses don't like such coincidences. Kidslox and Qustodio sent a complaint to the European Union competition office. 

In an interview for the New York Times, Amir Moussavian, CEO of OurPact, a parental-control iPhone app with more than three million downloads, said: “They yanked us out of the blue with no warning. They are systematically killing the industry.” Apple took down his app along with other top players like Mobicip, Kidslox, and Qustodio.

Apple denied any connection between this clean-up and the launching of its products. The company claimed that it was all about the system that provided apps with third-party control, access to a device, and information such as location, app use, camera permissions, email accounts, and browser history.

At the same time, it should be noted that Apple allows corporations to use such tools to control employees’ devices. 

All in all, this story has a happy ending. After a few months of proceedings, the removed applications got back onto the App Store. 

Apps and content value

This is a tricky and subjective reason to delete an app from a store. Still, such removals do take place, especially when the app is health-related.

In 2019, the BBC reported that Apple had removed 181 vaping apps. These apps let people control features of e-cigarettes, stay up to date with vaping news, or play themed games.

Company executives made the decision based on recent US research into vaping’s impact on health, which stated that 42 deaths and more than 2,100 cases of lung injury had been linked to vaping-related respiratory illness.

Amid the COVID-19 pandemic, both Apple and Google cracked down on all coronavirus-related applications not published by health organizations or governments. The companies took these measures to prevent misinformation and profiteering.

Restricted content

Both Apple and Google place strict demands on the content provided and generated in an app. Pay close attention to this list of the most frequent reasons for rejection and deletion from the stores due to inappropriate content:

  • sexually explicit content;
  • things that potentially endanger children;
  • graphic depictions or descriptions of violence or violent threats to any person or animal;
  • instructions on how to engage in violent activities like bomb or weapon-making;
  • self-harm, suicide, eating disorders, choking games or other acts that may result in injury or death;
  • bullying and harassment;
  • hate speech;
  • references to sensitive events like a disaster, atrocity, conflict, or death;
  • gambling;
  • illegal activities;
  • inappropriate user-generated content.

Even if you are sure there is nothing to worry about when it comes to your app, Google and Apple may have another point of view. 

Over the last few years, there has been an increasing number of complaints from Android developers. They claim that Google has started removing apps and terminating developers’ accounts for no reason. Numerous articles on Medium, AndroidPub, and other popular platforms express outrage and despair over this issue. Startups and long-running applications with hundreds of thousands of downloads have suffered from unpredictable and seemingly unreasonable removals.

Orangesoft’s case

After years of publishing many different types of apps, our company one day found itself in the same situation.

For a few months, we had been working with our client on a dating app for both Android and iOS. Its core advantage was meant to be a unique matching algorithm powered by artificial intelligence. The in-app functionality was designed to be as simple as possible: a built-in messenger and personal data management.

Originally, we decided to give people the opportunity to find matches for different purposes: love, business, sex, sex&friendship, and friendship.

Keeping in mind the specifics of the application and the strict rules of Apple and Google, we carefully examined the Developer Program policies, especially the sections about explicit content.

First, we submitted the app to the Google Play Store at the end of January 2020. It took a few days, but the app passed the review and was waiting to be published. We planned to submit the iOS version the week after and then make both public at the same time.

But a week after the review, we got an email informing us that the app was suspended and removed from the Google Play store. 

Notification from Google Play about the app removal

Usually, Google sends a warning email giving developers the chance to make changes and pass the review again. Not this time. Moreover, a few hours later, we found out that our client’s developer account had been terminated without any warning.

We guessed that the reviewer got the wrong idea about the app because of chat scenario titles like “sex” and “sex&friendship.” We once again reviewed Google’s statement about inappropriate content and didn’t find any mention of a ban on words with a sexual context. What’s more, there are some applications in the Google Play Store that not only use the word "sex," but also such terms as "swingers," "threesome," "BDSM."

☝️On July 8th, 2020, Google announced an update to the Developer program policies. Among others, they added one more paragraph to the Sexual Content and Profanity section:

"Content that is lewd or profane - including but not limited to content which may contain profanity, slurs, explicit text, adult/sexual keywords in the store listing or in-app."

Now that explains why the app could have been deleted from the Play Store.
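Given that policy wording, a cheap safeguard is to scan your store listing text and in-app labels for adult or profane keywords before uploading a build. A minimal sketch in Python (the keyword list is illustrative; Google does not publish an official one):

```python
# Hypothetical pre-submission check: flag risky keywords in store
# listing text and in-app labels before uploading a build.
# The keyword list below is illustrative, not an official Google list.
import re

RISKY_KEYWORDS = {"sex", "swingers", "threesome", "bdsm"}

def find_risky_keywords(text: str) -> set[str]:
    """Return the risky keywords that appear as whole words in text."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    return words & RISKY_KEYWORDS

scenarios = ["love", "business", "sex", "sex&friendship", "friendship"]
flagged = {s: find_risky_keywords(s) for s in scenarios if find_risky_keywords(s)}
print(flagged)  # → {'sex': {'sex'}, 'sex&friendship': {'sex'}}
```

Running a check like this against our original scenario titles would have flagged “sex” and “sex&friendship” before the first submission.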

Within the following week, there was a string of emails and appeals, all met with automated and useless answers from Google. Furthermore, we were puzzled as to why Google had decided to terminate the developer’s account. These situations follow a variety of scenarios: some accounts are terminated after two suspensions or removals, others after just their first strike.

The main issue in such situations is that all emails and appeals are received and answered by Google bots. This seemingly small obstacle renders all your efforts pointless. In this case, you have only two ways to solve the problem:

  • Write articles and posts on social media and on platforms like Reddit or Medium, hoping that someone from Google will come across your writing and give your case a proper review.
  • Do your best to meet, call, or email actual Google employees, especially those in Business Developer and Business Relations Developer positions. Try searching for them on LinkedIn, Facebook, or within your personal network.

We chose the second option, and soon our client and the owner of the dating app managed to make a detailed appeal to the right person. A few days later, the client's developer account was unblocked. 

We wound up removing the term “sex” and titling the chat scenarios “love,” “business,” “dating,” and “friendship.” We then sent an updated .apk file for review. Thankfully, this time it went smoothly.

Now the app is available for download on both the App Store and the Google Play Store.

Government request

If your business idea involves app promotion outside your home country, be sure you’ve carefully studied the foreign laws and the economic, social, cultural, and political landscape. Governments make businesses play by their rules and regularly send requests to remove apps from the stores over law violations.

China is undoubtedly the leader on this list, and Apple is often accused of being too cooperative. 

Worldwide government App Store takedown requests. Source: Apple Transparency Report: Government and Private Party Requests Worldwide H2 2018 and H1 2019

It began in 2017 when China deemed VPNs illegal unless they got permission to operate on the country’s territory. Soon after, Apple removed more than 600 VPN apps from China’s App Store. 

According to the latest Apple transparency report, 288 applications were removed during the first half of 2019. The company provided two official reasons for such a radical decision: platform violations, which may mean gambling (illegal in China), and legal violations, which often means pornography (also illegal in China). However, Apple has never provided more detailed information.

At the end of February 2020, the Washington Post reported that the virus-spreading simulator Plague Inc. was no longer available on the Chinese App Store. Again, it was deleted at the request of the Chinese government.

The game was published in 2012 and won an audience of more than 130 million players across PC and mobile. Its popularity grew especially with the spread of COVID-19.

“It’s not clear to us if this removal is linked to the ongoing coronavirus outbreak that China is facing,” said the Ndemic statement. “However, Plague Inc.’s educational importance has been repeatedly recognized by organizations like the CDC, and we are currently working with major global health organizations to determine how we can best support their efforts to contain and control COVID-19.”

Interestingly, other virus-related mobile games, including those developed by Chinese companies and supported by the Chinese government, are still available for download. 

Finally, here comes the cherry on top.

In 2016, China passed a law requiring licenses for all mobile games. The Play Store enforced the new rule straight away. Apple managed to avoid it for years, but the time finally came: in February 2020, Apple announced that all developers of mobile games with in-app purchases had to submit their licenses by June 30th.

According to official government data, as of June 2020, China had issued only 43,104 game licenses since 2010. Even if all of these licenses went to mobile games, there are around 60,000 paid games or games with in-app purchases on the Chinese App Store.

As a result, on July 1st, 2020, Apple stopped updates for more than 20,000 unlicensed mobile games. Among them is Netmarble’s Stone Age M, one of the top-earning mobile games; its revenue in May 2020 amounted to $2 million.

These license checks won’t affect big titles like PUBG Mobile, but they could kill small and medium businesses and significantly cut into Apple’s income.

Zhu Qinan, an analyst at Zhongtai Securities, said: “Our sampling shows that among the 200 top-grossing games in China’s app store, about 10 percent are unlicensed. The top 200 games contribute to over 80 percent of the App Store game revenue.”

“It takes four to eight months to get a game license, and [the government] only issued a total of 1,572 game licenses in 2019,” said Rich Bishop, AppInChina CEO. “This means that the vast majority of the 21,563 paid games or games with in-app purchases on the Apple App Store [in] China won’t be able to get a game license for several years, let alone by June 30th, 2020.”


After the exhausting and exciting process of creating the product, you are now one step away from your customers. Before you launch, take the time to study the store policies and the pitfalls described above.

Alternatively, you can leave these steps to the experts at a development partner company, as they know the path to a successful app launch inside out. Get in touch to find out how we can help.

You are prepared now. We hope that your application will soon be available for download on the App Store and the Google Play Store. 

Why was my app removed from Google Play?

There are some objective reasons why an app can be removed from Google Play:

  • Malware (apps with malicious content)
  • Disruptive adware (advertising that blocks an app’s core functionality)
  • Clone apps and plagiarism (a copy of an already existing app)
  • Apps and content value (e.g., health-related apps that may cause harm)
  • Restricted content (the most frequent kinds of inappropriate content)
  • Government request (a conflict between local laws and application content)

Why do apps get taken off the App Store?

Apple constantly updates its algorithms to catch apps that don’t follow its policies.
Top reasons why your app can be taken off the App Store:

  • Malware (apps with malicious content)
  • Disruptive adware (software with harmful advertising or illegal data collection)
  • Clone apps and plagiarism (an app with no value or uniqueness)
  • Access to data (Apple pays close attention to user data security)
  • Apps and content value (apps that may harm people’s health)
  • Restricted content (for example, sexually explicit content, violence or violent threats, bullying and harassment, hate speech, and so on)
  • Government request (removals demanded by local authorities)

Do apps get deleted for a one-star rating?

Neither Apple nor Google removes an application solely because of low ratings. An app can be deleted only if it violates the stores’ policies.
