We've been affected by these fake DMCAs for a month now and there is ZERO support from Shopify.
(thanks to the OP for shedding light on this situation)
Running a business through Shopify has become very risky these days.
We are now working on transferring our online stores to WooCommerce, where it is far more difficult to get your content removed with bogus DMCA notices.
Picture the situation to understand the Nightmare:
1) You are driving paid traffic to a product page with good ROAS because your ads have been optimized over months.
2) Then the competitor submits a fake DMCA
3) Your page's content is removed automatically by Shopify (no check on their end)
4) Your optimized ads are now directing customers to an empty page
5) Then you risk having your ads accounts flagged
You might think you could just redirect the URL to a duplicate?
That would be a great solution indeed. BUT
6) The competitor is monitoring your product page and immediately submits another fake DMCA
7) You redirect again
8) Then another fake DMCA
Now WAIT for the cherry on the Cake.
9) After 4 or 5 fake DMCAs, your Shopify admin account is locked
10) You may even get suspended automatically by Shopify
11) Then you can no longer fulfill your orders from customers
12) Then your customers start making chargebacks
13) Then the payment processors with which you've built a reputation over the years can suspend you
14) Shopify (zero) support asks you to reply to the DMCA with a counter notice and WAIT for the Trust & Safety Team to get back to you
15) Then you receive an auto reply from them telling you that the other party has been informed and that you may repost your content in 2 weeks unless the scammer has filed a court action.
16) Then after 2 weeks, you repost your content
17) Then the same or the next day, the same scammer (watching you like a hawk) issues another fake DMCA for the same product page through the same fake account.
ISN'T IT BEAUTIFUL?
Shopify could literally STOP 80% of these false DMCAs by just adding a field to their rudimentary form asking claimants to verify a phone number or upload an ID. But NO, that seems to be a task too challenging for a billion-dollar company like Shopify.
This could easily be fixed by attaching a consequence to fake DMCA reports. A fine or something. Something messed up that YouTube does: when someone files a copyright claim, they get to take the money from your video. I have had times when I was sent a copyright strike for videos I made, and when I check, my video is usually much older than the music I am being accused of stealing.
I always win these disputes for obvious reasons, but YouTube has never offered to make things right.
Basically the accused is guilty until proven innocent which is not how it should be.
Part of the problem is many of these companies have a faux DMCA process.
Rather than use the "official" DMCA process, they build a reporting mechanism in front of it. People can abuse it without facing federal repercussions.
Wait, is YouTube still doing that? I thought they moved to putting the money in escrow until the dispute was settled, which cut down on the majority of the false claims?
It’s been a couple years since my last attempted strike but it’s good to hear that they are trying solutions. Maybe that’s why I haven’t gotten one in a while.
Isn't a fake DMCA report "lying for profit," AKA fraud? And it's done on a computer, making it wire fraud, a federal crime. AFAIK this is already against the law and can be prosecuted; no need to add a new law for fines.
I’m pretty sure they’re not legally allowed to add extra requirements like this. If they were, sites could make it almost impossible to take things down by adding a lot of requirements
As I learned from this comment [0], that kind of is a requirement. Section 512 requires a notice contain:
> Information reasonably sufficient to permit the service provider to contact the complaining party, such as an address, telephone number and email address
I think another poster hit the nail on the head when they said (and I'm paraphrasing) that these aren't real DMCA takedowns. They're a different system these hosts put in front of the DMCA that operates outside of it. By being more liberal about what gets shut down, they stay in compliance with the DMCA, which is much more onerous about what is required to shut things down.
In fact, if this was really the DMCA process, people would have legal standing to go after the accusers via Section 512.
Further, I'm also pretty sure you don't have to take things down right away. You do have time to investigate the alleged violation and decide whether you are in the right.
Basically, all these players are hiding behind systems they put in place so they wouldn't come close to DMCA issues or have to pay anyone to investigate notices.
This is currently being debated in the highest court of the USA right now. It's pretty dumb and absurd to us techies, but apparently it's not a settled matter in law.
> Information reasonably sufficient to permit the service provider to contact the complaining party, such as an address, telephone number and email address
The words "information reasonably sufficient" set the bar of compliance. The list is an example, not an explicit requirement; hence the words "such as".
In this context, I think "and" could be reasonably argued to mean either "and" or "or". And given that the purpose of the contact information is "Information reasonably sufficient to permit the service provider to contact the complaining party" and not for identity verification, I'd guess that no company is willing to go to court to test this.
But is "and" being used to form a set of acceptable forms of identification (any of which would be sufficient), or is it joining the forms of identification together as all being required?
It's joining them all together, but the whole thing is presented as an example. Any avenue that provides the ability to contact presumably meets that standard.
If Shopify were to alter their contact info requirement, it would clearly be with a different goal in mind than the one laid out in that sentence. That's assuming of course, that the current requirements have actually led to communication.
To blindly listen to anonymous reports on the internet is not good faith. Shopify has a duty to handle these in good faith and it’s infinitely foreseeable that this bad faith conduct could result in serious damage to their customers (as you explain).
We've been harassed by the same scammer, who has filed 7 fake DMCAs over the past 2 weeks, and Shopify automatically removes your content with zero checks. That dude just listed the product pages he wanted taken down and used totally irrelevant links, even a .xyz domain, as the source for what he claimed was the original content.
Shopify support does absolutely nothing even if you send them 20 emails.
Hundreds of legit businesses are affected by this loophole exploited by scammers.
My sister's Etsy shop (which is her family's sole source of income) was hit by this scam last Christmas during the holiday rush: DMCA takedowns from copycat vendors to wipe out competitors and (temporarily) steal their business.
It's not a problem Etsy or others take seriously, even when the attack hits stores with millions of dollars per year in sales.
It’s yet another case of a tech company refusing to staff enough workers to manually review and verify things that computers shouldn’t be doing by themselves.
I agree that tech companies have nonexistent support because they refuse to hire people to do it, but the DMCA is a terrible law and enables people to do this way too easily. It should cost money (or something) to file a DMCA claim to prevent copyright trolls and people from abusing the legal system.
Tech companies _cannot_ do anything. The DMCA is a terrible law, and it requires that the company receiving a DMCA takedown immediately take the claimed URLs down, or become liable if the claims aren't fraudulent. I recall originally it sounded like not only was an immediate takedown required, but they also had to keep the content down for two weeks even if the victim filed a counter-claim showing that it's BS.
The law is designed specifically to enable this, because the recording industry bought a bunch of politicians, and said "hey we need to have a zero cost zero risk mechanism for us to remove content we believe to be pirated, without having any penalties for being wrong, but there must be penalties for the host services if they try to stop us".
I like this. Large deposit with some gov agency and the money gets awarded to the victim (or refunded to the claimant) when the dispute is finally resolved. You don't submit a claim unless you're really sure that you'd win.
This is how the DMCA should have been designed in the first place.
The app we're currently releasing does not have auto-signup. Each account signup request is reviewed manually.
Needless to say, this is difficult to scale. It's fine for our app, which serves a fairly small demographic and does not make money, but it would be a big problem for major-league outfits.
I have a feeling that one of the first commercial applications of AI will be moderation and response to customer complaints.
> I have a feeling that one of the first commercial applications of AI will be moderation and response to customer complaints.
> We'll have to see how well that works.
Worth noting that almost all of YouTube's front-facing customer service is heavily, heavily automated, which is a source of constant hostility and aggravation for YouTubers: they must interact with these ML agents, get a stupid, nonsensical answer, then reply again to get a human reviewer who (usually) resolves the situation. It would have been resolved faster by a human in the first place, with less frustration for all involved.
I think it's a fantastic idea to not automate the process of signups for a service, and yes, it will substantially slow down the process. But I think everyone has just about had their fill of all these services that produce bad outcomes extremely quickly when a trivial amount of human involvement would've handled whatever issue better, in a perfectly acceptable time frame.
If you don't want to have a customer service department that is well staffed and well paid to handle customer complaints/problems/issues/what have you, then my suggestion is do not involve yourself with or build a company that will need one. Simple as.
Edit: And like, I think it's fair to extend this to all these sectors of a normal, functioning business that silicon valley companies are happy to "automate" to cut their costs:
- If you don't want to provide the resources for human approval for your platform, then don't start a platform.
- If you don't want a provide the humans to moderate content, do not host content.
- If you don't want to handle verifying ID's and driver's licenses, running background checks, and all the rest that is Bog Fucking Standard for a cab company, then don't start a cab company.
It just blows my mind how many of these businesses have gone into the field of X or Y and just... not done, or half-assed, huge aspects of those fields, and then shrug their shoulders when people get pissed that they can't get help. Do you think all these dinosaur businesses you're trying to disrupt went out of their way to find an excuse to hire customer service agents for the hell of it? No! They needed people to solve problems for customers with the product they sell!
I'm bearish on AI/ML in the customer support space. It's been terribly easy to trick these tools.
Maybe instead of moderation and response, it would function better as a blackbox middleware to the customer support onboarding process. So instead of tackling the problem at the customer level, you use AI/ML to enable your existing support staff to scale their workload.
The picture I have in my mind is a pre-processor for new tickets that would add a series of tags to a support request. That would allow your support staff to be assigned to tags where they specialize. This simplistic implementation would probably integrate into existing service management platforms without having to stand up Yet Another B2B.
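The pre-processor idea above could be as simple as a rule-based tagger sitting in front of the ticket queue. A rough sketch of the concept, where the tag names, keyword lists, and routing table are all made-up assumptions (a real system might swap the keyword matching for an ML classifier):

```python
# Illustrative sketch: tag incoming support tickets, then route them to
# staff who specialize in those tags. Tags and keywords are invented.
TAG_RULES = {
    "dmca": ["dmca", "takedown", "copyright", "infringement"],
    "billing": ["invoice", "charge", "refund", "chargeback"],
    "account": ["login", "password", "locked", "suspended"],
}

def tag_ticket(text: str) -> list[str]:
    """Return the tags whose keywords appear in the ticket text."""
    lowered = text.lower()
    return [tag for tag, words in TAG_RULES.items()
            if any(word in lowered for word in words)]

def route(ticket_text: str, specialists: dict[str, str]) -> str:
    """Assign the ticket to the first specialist matching one of its tags."""
    for tag in tag_ticket(ticket_text):
        if tag in specialists:
            return specialists[tag]
    return "general-queue"
```

The appeal of this shape is exactly what the comment describes: it scales the existing human staff rather than replacing them, and it can feed tags into whatever service-management platform is already in place.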
>but it would be a big problem for major-league outfits
Not only is this a lie you've been willingly sold, it appears you've bought it and taken it home.
This said I don't think you should feel too bad, we the consumer buy these same lies all over the place "If we have to follow these regulations then it's going to be a big problem for the industry".
At the end of the day, some of these problems do not scale easily and cheaply. In those cases businesses have the choice of willingly being good societal actors and taking a loss of profit in order to prevent issues, or being forced by regulation to behave in a fitting manner.
> Not only is this a lie you've been willingly sold, it appears you've bought it and taken it home.
That was unkind. I didn't think we behaved that way, here, but I'm often wrong...
It certainly can be done, but that usually means the shareholders need to hold off on that second lambo. That doesn't often play well, in the boardroom.
I don't think OP was unkind at all; maybe a bit blunt, but telling someone you think they've been lied to and duped and that they're a victim is not unkind in my opinion. Powerful entities have been lying and duping common people for... well, probably for all of human history. The OP may or may not be correct, but from my reading of their post, I don't think they intended that as an attack of any kind, but rather a commiseration.
Well, maybe in the short term. But this looks like something that could very well kill them in the longer term by making their platform so toxic no one would want to touch it.
I think they legally have to allow anyone to submit these requests and act on them immediately otherwise they could be held liable for copyright violations
> Provided the notification complies with the requirements of Section 512, the online service provider must expeditiously remove or disable access to the allegedly infringing material, otherwise the provider loses its safe harbour and is exposed to possible liability.
> Information reasonably sufficient to permit the service provider to contact the complaining party, such as an address, telephone number and email address
>> Shopify says that it’s not feasible for the company to investigate the validity of all takedown notices in detail. As such, these false claims resulted in actual removals and the affected stores also received strikes on their accounts.
Unlike the DMCA takedown, the "strike" is by no means required by any laws. That's just Shopify choosing to penalize takedown recipients without doing any investigating.
The idea of "strikes" is pretty standard in the US from the 90s era "3 strikes" crime laws (i.e. commit 3 crimes and you get life without parole; these have generally been repealed in the last 5-10 years or so) that were inspired by the rules of baseball where the batter is out after 3 unsuccessful swings or hittable pitches he didn't swing at.
It isn't a good system, but YouTube in particular was loaded with pirated content in the early days before Google built automated copyright moderation, so they most likely needed to do something drastic to delete pirate accounts to avoid legal trouble with the copyright lobby, and everybody else just copied the idea as a "best practice".
It is kinda required. The DMCA requires you to implement a policy for terminating repeat copyright infringers. You aren't required to call them "strikes", but you have to keep track of who is a repeat infringer and terminate their account. You have some leeway in how you implement it, but you do need to take action against repeat infringers. As such, any infringement that doesn't get overturned will end up as a strike against you (whether it's called a strike or not).
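The repeat-infringer bookkeeping described above amounts to a small ledger. A minimal sketch, where the three-strike threshold and the method names are assumptions for illustration (the statute leaves the exact policy to the provider):

```python
# Sketch of a repeat-infringer policy: count unchallenged takedowns as
# strikes, terminate at a threshold, and roll back a strike if a
# counter-notice prevails. The threshold of 3 is an assumption.
from collections import defaultdict

class InfringementLedger:
    def __init__(self, termination_threshold: int = 3):
        self.threshold = termination_threshold
        self.strikes = defaultdict(int)   # account -> strike count
        self.terminated = set()

    def record_takedown(self, account: str) -> None:
        """Count a takedown that stood (unchallenged or lost) as a strike."""
        self.strikes[account] += 1
        if self.strikes[account] >= self.threshold:
            self.terminated.add(account)

    def overturn(self, account: str) -> None:
        """Remove a strike when a counter-notice prevails, and reinstate."""
        if self.strikes[account] > 0:
            self.strikes[account] -= 1
        if self.strikes[account] < self.threshold:
            self.terminated.discard(account)
```

Note how this mechanism interacts with the abuse in the thread: if bogus takedowns are counted as strikes before any counter-notice resolves, a scammer filing a handful of notices can push an account over the threshold automatically.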
Then change the form and make the address, email address and phone contact mandatory. If they want to file a DMCA through a less inconvenient mechanism, let them.
> Then change the form and make the address, email address and phone contact mandatory
That's the context I was referring to. If Shopify requires an address and phone number from DMCA claimants, then won't the DMCA claimants simply sue Shopify for making the process too restrictive?
It looks like Shopify has decided not to play that specific game.
Hmm, I think the law is bad here, though there might be relevant precedent that a lawyer can point to:
>Information reasonably sufficient to permit the service provider to contact the complaining party //
The complaining party is a party with a [genuine] complaint of copyright infringement. A specious claimant is not really a complaining party. Whilst an email address may allow you to contact a party, it's not really sufficient to contact a [genuine] complaining party; the email address will put you in touch with lots of correspondents who are not complaining parties but are instead vexatious complainants.
Requirement for a notarized affidavit delivered by registered mail would seem like it would be no bar for genuine complainants who were subject to a loss that the court, or public process, should care about.
That might be too onerous? Maybe parties should prove they have registered the copyright in their works with the US Copyright Office, as they would in an infringement proceeding?
This situation seems like what happens when you let corporate interests write the laws and you just sign them in.
That sentence defines the need for contact information in explicit respect to contacting the complaining party.
That does not answer whether Shopify is free to use more information to filter out fraudulent claims; though my understanding is that Shopify is not really allowed to filter out fraudulent claims at all, so that point is moot.
You do not have to register your copyright unless you intend to go to court. The DMCA should be a first notice of "I think you are infringing; take this down so we can all avoid the trouble and cost of court." If you refuse to take things down, then the claimant should get a lawyer, register the copyright, and go to court. However, as a first notice, I want the DMCA takedown to be simple and cheap for both parties if the guilty party admits their mistake and fixes the problem. In general, nobody registers a copyright until just before they open a court case (you get triple damages for everything after the registration).
Where the DMCA went wrong, in my opinion: upon notice, the host should have one business day to notify whoever posted the content, and the poster should get one business day to respond. If they take down the offending content, then all is well (the amount you could gain from two days of something being up isn't worth your court costs). If there is no response at all, only then is content taken down automatically. If there is a response that the content doesn't infringe, then the content stays up, but the response must contain full legal contact information: the accuser is then required to go to court to get the content taken down. Note that "court" means the court of law in the country where the accused lives.
The above is how normal cases should work. There are some tricky cases that need to work differently: the accused is in a country that doesn't recognize copyright, the legal contact information is invalid, the accuser is submitting many false claims, or the middle party doesn't do their part. I'll post my ideas, but there is room for better ones, and a lot of details need to be worked out.
If the accused party is in a country that doesn't recognize international copyright (China being the big example), then takedowns need to happen immediately: we cannot trust your courts to protect our people, so you in turn are assumed wrong; take it up with your government. (I can't think of anything better here; I'm open to better ideas.)
If the legal contact information given in the response is invalid, then the entire account should be terminated and all content deleted.
"Many false claims" needs to be legally defined. I'm going with: the copyright owner and (not or!) their authorized agents send more than 100 takedowns for content that, after 1 year, is still up with no court case filed.
If the middle party (YouTube, say) doesn't react to court orders to take things down, or to the poster asking to take infringing content down, then they are in trouble. Otherwise they are only a middleman.
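The proposed flow in the comments above can be sketched as a simple state machine. To be clear, all the states, events, and outcomes here are the commenter's proposal, not how the DMCA actually works; the names are made up for illustration.

```python
# A sketch of the PROPOSED notice/response flow described above,
# not the actual DMCA process. States and events are illustrative.
from enum import Enum, auto

class State(Enum):
    POSTED = auto()       # content is up, no complaint yet
    NOTIFIED = auto()     # host forwarded the notice to the poster
    TAKEN_DOWN = auto()   # poster removed it, or never responded
    STAYS_UP = auto()     # poster countered with legal contact info
    COURT = auto()        # accuser must now sue to get it removed

def step(state: State, event: str) -> State:
    """Advance the takedown dispute by one event; unknown events are no-ops."""
    transitions = {
        (State.POSTED, "notice_received"): State.NOTIFIED,
        (State.NOTIFIED, "no_response"): State.TAKEN_DOWN,
        (State.NOTIFIED, "poster_removes"): State.TAKEN_DOWN,
        (State.NOTIFIED, "counter_with_contact_info"): State.STAYS_UP,
        (State.STAYS_UP, "accuser_files_suit"): State.COURT,
    }
    return transitions.get((state, event), state)
```

Notice how the default in this design is the opposite of the current law: content only comes down automatically when the poster stays silent.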
No, any form that sites provide is purely a convenience and is not required by law. Legally, all you need is contact information where a lawyer can mail a DMCA complaint.
The DMCA does not allow 1, and they risk legal liability if they do 2. The DMCA has no concept of fraud prevention built in, and any efforts a web operator takes to prevent fraud fall into a legal grey area that could increase the operator's liability. This is either a flaw in the DMCA or a feature, depending on who you are.
There are plenty of solutions within the law that aren't a grey area.
As an example, send a DMCA/C&D to shopify.com - do they have to take down their homepage? Of course not. Did they set up a robot to take down their homepage when it receives a request? Of course not.
> DMCA has no concept of fraud prevention
Yes it does. The safe harbor requirements require that material be removed expeditiously upon proper notice. A bogus notice is not proper notice. Acting on a bogus notice is against Shopify's own notice-and-takedown procedure. They are creating more harm for themselves by doing this.
The difference with the home page is that it isn’t subject to safe harbor like third party content assuming the home page doesn’t contain user content (I haven’t actually looked). Shopify is already civilly liable for copyright issues on their home page. Since they have no safe harbor, there is no reason to follow the safe harbor rules.
But if the recipient gets it wrong and misclassifies a proper notice as a bogus notice, they are liable under the law. There is no safe harbor for "we acted in good faith but we were wrong".
The real problem is Shopify is not the one who should receive this notice. They should act like a post office and pass it onto their users. Shopify should only do more if the poster does not respond in a reasonable amount of time, or if a court order is obtained. (Or of course the user takes things down). Of course a user responding to leave content up needs to provide their legal contact information so this can go to court.
But if they don't take it down as soon as they are "notified" then they lose the safe harbor protections. I find the DMCA enraging but the law is pretty clear. You incur some liability by not taking down immediately, and no lawyer or business is going to incur liability when they don't have to.
To the implementation established through a century of corporate lobbying by media corporations; yes, I'd very much agree.
The core is, people who create useful works deserve to make a living and not simply have the financial benefit of those works go to others who are rich enough to exploit them.
I'd be interested if you don't agree with that core, if you could explain why?
Monopoly is not the route we should be taking to achieve that. Copyright only manages to support a select few artists. It does so by making it more difficult to create art in the first place!
The overwhelming majority of copyright benefit (in dollars) goes to "others who are rich enough to exploit" the system itself, not to the artists who are in need of "a living". Those absurdly wealthy groups make it more difficult and more expensive for artists to compete with them. The tool they use for anti-competitive behavior is copyright. Copyright is only useful for anti-competitive behavior, because copyright is literally defined as monopoly.
---
If we did not have copyright, people would still be free to financially support artists. There would only be two major differences:
1. Artists would not be able to compel people to financially support them.
2. Giant corporations would not be able to compel artists to give them monopoly over their art. Artists would be entirely free to create new art, and to seek direct financial benefit from that art.
I argue that the second effect is by far the greater of the two, particularly in respect to artists who need a living wage.
>The overwhelming majority of copyright benefit (in dollars) goes to "others who are rich enough to exploit" the system //
Yes, in the copyright system we _have_ in the USA/Europe (probably globally). But fundamentally copyright was birthed out of protecting creators of works (authors of books, in the case of the Statute of Anne). Capitalism is fundamentally corrupting of every system, but that doesn't mean the "core" of the underlying system was bad. That is all I argue for here: that a just copyright system that actually encourages sharing of works could exist, modelled on the fundamental core of copyright, which is protecting authors/creators from predatory industries and maintaining things such as the right of authorship (being acknowledged as author).
I am not opposed to intellectual property. (And, in fact, create it: I'm a programmer.)
The problem with the DMCA is that in practice it's ridiculously lopsided. It's *supposed to be* under penalty of perjury but no harm comes to those who make false claims.
I'm just curious; should I be able to profit from selling high-quality prints of a living artist's paintings that I surreptitiously photographed in a gallery?
We can debate the exact length copyright should be (I favor 25 years), but that is the real question. If the copyright is expired, then you can, if not you cannot. I don't care if the artist is living or dead. A dead artist may have heirs (kids, wife...) that they expected to still be supporting except they happened to die.
> they risk the possibility of legal liability if they do 2
Have any companies been successfully sued for attempting to police DMCA fraud on their platform? Be interesting to know how courts treat good-faith efforts and if there have been instances where a company just phoned it in and got in trouble for half-measures.
As far as I am aware, companies are largely very risk-averse to this sort of thing and none have yet made any effort to combat fraud. In fact, I am pretty surprised to hear about Shopify even taking this step.
I doubt the DMCA provides a provision for seasoning. IIRC, there is cause for suing someone over misrepresentation, and customers should have the ability to file a counter notice before having content removed.
However, companies often have policies that are more restrictive than DMCA (YouTube, for example, will remove things that the DMCA would not require).
False copyright claims are a civil matter. A fine equal to a fifth of the last five years of annual income and wealth increase (e.g. share value accrued, house value accrued), plus restitution of all actual losses, plus payment of court fees, should be sufficient.
Prison is a massive cost to society, we should avoid it except where it's entirely necessary.
False DMCA claims are perjury, which is (at least in some states) a felony.
The rationale is that the DMCA gives complainants the ability to restrict others' speech, and so the law wants to strongly disincentivize abuse of that power.
In any case, I didn't express it well. What I intended was that, in my personal opinion, these matters should be treated as a civil matter in view of copyright essentially being a tort.
USA Americans seem to contort the notion of speech beyond recognition. Being prevented from publishing a video someone else created, say, doesn't inhibit your right to express any opinion (aka freedom of speech). But of course, disincentivising abuses of power is always good.
Yes. DMCA claims are made under penalty of perjury.
> Being prevented from publishing a video someone else created, say, doesn't inhibit your right to express any opinion (aka freedom of speech).
Indeed, the inhibition of free speech is when you are prevented from publishing a video you created, because someone else falsely claims that they created it. Hence the felony.
Not running afoul of laws, regulations, and other large companies is more important to them than keeping one person's business online. The DMCA in general is flawed. It forces companies to be judge, jury, and enforcer on something that should be a lawsuit or criminally enforced. The truth is it's impossible to police, so we ended up with the DMCA out of desperation from big businesses mad that they might lose a few dollars.
Because if Shopify gets the take-down request and ignores it, they are now potentially liable for the infringing content remaining available after the complaint has been made. Reviewing each complaint requires billable hours for someone, and just blanket accepting every one and removing the content is easily automated and incredibly cheap. And because they're so big and handle so many websites, even if every complaint is invalid and bullshit, the stores that get nuked in the process won't hurt their bottom line as much as a potential IP lawsuit would.
I had a business a decade ago where our competitor hired someone to consistently email PayPal and tell them we were actually selling CSAM. PayPal had to act every time by immediately blocking our account and putting us out of business for a week while they checked and saw that we were not in fact selling anything untoward.
If you're an asshole there are many methods online to really screw with your competitors :(
The sad part is that legitimate users are the ones being affected, even when the content actually infringes on another person's rights. There must be a faster way to counter-notice without the loss of revenue in case something untoward is happening.
There are many people who make their livelihood by drop-shipping products from China.
Since the products are identical, the only way to succeed is to be the store the customer sees first. Some invest heavily in SEO/paid ads. Others try to harm the competition with things like DMCA requests and negative reviews.
But one thing I've seen first hand is that the competition is brutal. Everyone just steals website content, images, marketing material etc from the competitors. Whatever they can do to get ahead.
I’m sympathetic to content creators generally but I find it hard to sympathize with drop-shippers - it seems like such a silly, useless and unnecessary business. Perhaps there’s an analogy to High Frequency Trading though - they “provide liquidity at the margin” for crappy goods?
Drop-shipping DID have a purpose back in the '80s through the '00s, when you couldn't buy the stuff one-off from platforms like AliExpress, but most drop-shippers aren't doing it like the drop-shippers of yore did.
Starting the "back in my day rant": In the past, you would actually negotiate with your manufacturers about how many items you would purchase over the year, and that would determine the discount you got on their SRP. You could then mark it up, but for the most part you didn't, and your profit was your discount.
Nowadays, though, everyone is just plugging directly into AliExpress/Temu/Wish and marking it up. They are just a lazy, low/no-value middle-person between me and cheap knockoffs from China. The "clever" ones are worse, because you might not even realize you are buying a knockoff: the packaging, images, titles, and descriptions will be 1:1 with the real product.
While this is true of people only doing drop-shipping, my company does a lot of drop-shipping and JIT delivery. We have our own inventory, but we also have contracts with vendors that are not available to consumers.
We are also better equipped to guarantee supply chain integrity and have much better customer service. Getting a warranty on a part can be difficult, but we have dedicated people who work with vendors every day. All the customer has to do is stick a shipping label on the box.
Philosophically, how do drop-shippers compare to, say, furniture shops that only have showpieces on display? When you pick a piece, they pass the order along to the manufacturer to deliver.
Or, for that matter, car dealerships, which operate on the same model.
You raise an interesting point that is lost on a lot of anti-Wall Street folks. Imagine if consumer products had Reg NMS.
No matter where you went to buy a fungible consumer product (SKU), the retailer would be required by law to give you the best price available anywhere in the world at that exact microsecond.
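To make the thought experiment concrete, here's a toy sketch of what a Reg-NMS-style rule for retail would mean: the seller must fill your order at the lowest quote available across all venues for that SKU. Everything here is illustrative; the venue names and prices are made up.

```python
def best_quote(quotes: dict[str, float]) -> tuple[str, float]:
    """Return (venue, price) for the lowest available price on a
    fungible SKU -- the price a Reg-NMS-style rule would force
    every retailer to honor at that instant."""
    venue = min(quotes, key=quotes.get)
    return venue, quotes[venue]

# Example: three hypothetical stores quoting the same SKU.
print(best_quote({"store_a": 19.99, "store_b": 17.50, "store_c": 18.00}))
```

In equities, that obligation is what makes venues compete on price rather than on who the customer happens to see first, which is exactly the dynamic missing from drop-shipping.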
1. Identify a thriving niche, and stores within that niche.
2. Copy everything they do. Steal their photos, descriptions. Everything, really. Launch a bunch of stores that sell the exact same thing, with exact same listings, etc.
3. Attack the store with bogus DMCA takedowns, deluge them with fake reviews.
4. Boost your own store(s) with fake positive reviews.
5. Replace the stores you are competing against.
If you're located in China and can do that with impunity, what are the owners going to do? Amazon seems happy to take any business.
> literally anyone can open a fake account and submit a DMCA to take down
Doesn't the law require this?
> Shopify automatically removes your content with zero check
Doesn't the law also require this?
AFAIK, the only thing the DMCA allows is for them to put the content back after you swear to them that it's not copyright infringement. But it's not very clear how much evidence they should request.
> after you swear to them that it's not copyright infringement. But it's not very clear how much evidence they should request.
Should they? I thought the whole point of DMCA Safe Harbor is that third parties (such as providers and hosting services) don't have to figure out what's infringing and what's not but merely accept statements from all the involved parties.
If removal on notice is a one-click process, restoring on counter-notice could (should!) also be a one-click process.
And if DMCA doesn't prohibit proactive counter-notices, marketplaces that care about their sellers should let them preemptively swear that all their products are not infringing on anything, and save the counter-notice, so takedown requests would be immediately served with "we took the content down for a femtosecond (as legally required), but we got this counter-notice, so we restored the content - now you can go to a court". Not sure how legal this could be, but giants like Etsy's or Amazon's legal teams surely can test those waters if they would want to stop the DMCA abuse.
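The preemptive counter-notice flow proposed above could be sketched roughly like this. Everything here is hypothetical: `Listing`, `TakedownHandler`, and the field names are made up for illustration, and whether this flow actually satisfies the statute is exactly the open question the comment raises.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Listing:
    listing_id: str
    visible: bool = True
    counter_notice: Optional[str] = None  # seller's sworn statement, filed preemptively

class TakedownHandler:
    """Hypothetical marketplace-side flow: take content down on notice
    (as the notice-and-takedown process requires), then immediately
    restore it if the seller has a counter-notice on file, referring
    the claimant to the courts."""

    def __init__(self):
        self.listings: dict[str, Listing] = {}
        self.log: list[str] = []

    def register(self, listing: Listing) -> None:
        self.listings[listing.listing_id] = listing

    def file_counter_notice(self, listing_id: str, statement: str) -> None:
        # Seller swears, once and in advance, that the content is theirs.
        self.listings[listing_id].counter_notice = statement

    def handle_takedown(self, listing_id: str, claimant: str) -> str:
        listing = self.listings[listing_id]
        listing.visible = False  # removed on notice, as required
        self.log.append(f"removed {listing_id} on notice from {claimant}")
        if listing.counter_notice:
            listing.visible = True  # restored from the stored counter-notice
            self.log.append(f"restored {listing_id}; claimant referred to court")
            return "restored"
        return "removed"
```

With a counter-notice on file, a bogus takedown becomes a no-op from the seller's perspective, and the burden shifts to the claimant to actually go to court.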
The solution to all of it is to discourage false reporting by imposing penalties.
The reason for frivolous lawsuits, DMCA takedowns, rape accusations, fake news and internet misinformation is that people face no practical consequences for false reporting.
If you are caught making a blatantly false report, you should face a stiff penalty.
The penalty should be just enough that normal people think twice before filing a false report. This would discourage 99.9% of people from false reporting and then allow authorities to assign adequate resources to deal with the remaining 0.1%.
This can have a chilling effect on the reporters - remember that small businesses and large corporations are on both sides of this interaction at different times. If Sony BMG can just wholesale steal your music from bandcamp and threaten to drown you in legal fees if you dare formally file a DMCA it'll lead to a lot of pain and marginalization for some people - just like the ability for anyone to make these complaints with very little verification causes pain for other folks.
I personally think that DMCA was terribly designed - but a replacement would need careful consideration from all sides.
I also think the DMCA is terribly designed, but it is no wonder -- it was designed by copyright holders, with only their interests in mind.
As to chilling effect on reporters, I disagree.
We have laws to protect people from libel, and yet they do not have a chilling effect on reporters. And it is essentially the same kind of thing I am thinking about.
The damage from libel is to an individual. The damage from misinformation is to entire society.
The reason libel laws work is that the number of cases isn't overwhelming the court system.
But rewind time 20 years and suspend libel law for a moment, so that everybody could publish in a newspaper or on TV whatever they wanted about whoever they wanted. What do you think would happen? A deluge of false information about everybody.
What would people say about somebody wanting to introduce a libel law? That "it would have a chilling effect on journalism".
> We've been harassed by the same scammer, who placed 7 fake DMCA notices over the past 2 weeks, and Shopify automatically removes your content with zero check.
So much for " The e-commerce platform typically receives thousands of takedown notices per month from rightsholders, which are in part processed automatically. That works well in most instances, but not always."
As a person who's built at least half their career on Rails, I want Shopify to succeed in a big way, but every "platform" acts the same way. What's worse is that I've read other stories from Shopify creators who are having their stuff directly ripped off, and their complaints have no effect.
Why do the people running platforms all wink and nod, and agree to enshittify the service for the creators in the same way, leaving them nowhere else to go? Seems like there's an opportunity to make more money by breaking the mold, and being a platform that favors creators over complaints.
Ahhh, don't worry; I'm sure AI will sort all of this out shortly. <eyeroll> Anything to avoid hiring actual people with brains to sort these kinds of things out.
If these automated systems are so great, you'd think there'd be relatively little gray area for human beings to sort out, but no one seems to actually do that, until a stink is made on some social media that the company can't avoid.
(thanks to the OP for shedding light on this situation)
Running a business through Shopify has become very risky these days.
We are now working on transferring our online stores to WooCommerce, where it is far more difficult to get your content removed with bogus DMCA notices.
Picture the situation to understand the Nightmare:
1) You are driving paid traffic to a product page with good ROAS because your ads have been optimized over months.
2) Then the competitor submits a fake DMCA
3) Your page content is removed automatically by Shopify (no check on their end)
4) Your optimized ads are now directing customers to an empty page
5) Then you risk having your ads accounts flagged
You might think you could just redirect the URL to a duplicate? That would be a great solution indeed. BUT
6) The competitor is monitoring your product page and immediately submits another fake DMCA
7) You redirect again
8) Then another fake DMCA
Now WAIT for the cherry on the Cake.
9) After 4 or 5 fake DMCA notices, your Shopify admin account is locked
10) You may even get suspended automatically by Shopify
11) Then you can no longer fulfill your orders from customers
12) Then your customers start making chargebacks
13) Then the payment processors with which you've built a reputation over the years can suspend you
14) Shopify (zero) support asks you to reply to the DMCA with a counter-notice and WAIT for the Trust & Safety Team to get back to you
15) Then you receive an auto-reply from them telling you that the other party has been informed and that you may repost your content in 2 weeks unless the scammer has filed a court action.
16) Then after 2 weeks, you repost your content
17) Then the same or the next day, the same scammer (watching you like a hawk) issues another fake DMCA for the same product page through the same fake account.
ISN'T IT BEAUTIFUL?
** Shopify could literally STOP 80% of these false DMCA notices by just adding a field to their rudimentary form, asking to verify a phone number or upload an ID. But NO, that seems to be a task too challenging for a billion-dollar company like Shopify
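For what it's worth, the friction being asked for is trivial to implement. Here is a minimal sketch of server-side validation for a notice form, requiring at least one identity signal before a takedown is accepted. All field names (`phone_verified`, `id_document`, etc.) are invented for illustration; Shopify's actual form and fields are unknown to me.

```python
import re

# E.164 international phone format: "+" then 8-15 digits, no leading zero.
E164 = re.compile(r"^\+[1-9]\d{7,14}$")

def validate_notice(form: dict) -> list[str]:
    """Return a list of validation errors for a takedown-notice form;
    an empty list means the notice may proceed to review."""
    errors = []
    if not form.get("claimant_name"):
        errors.append("claimant name is required")
    if not form.get("infringing_url"):
        errors.append("infringing URL is required")
    phone = form.get("phone", "")
    phone_verified = form.get("phone_verified", False)
    id_uploaded = form.get("id_document", False)
    # The proposed friction: a verified phone number OR an ID upload.
    if not ((E164.match(phone) and phone_verified) or id_uploaded):
        errors.append("verify a phone number or upload an ID")
    return errors
```

A throwaway email passes today's bar; a verified phone number or ID at least forces the scammer to burn something traceable per fake account, which is exactly what drives the suggested 80% of drive-by abuse away.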