Minding the Value Gap: The UK Parliament’s Report on Disinformation and Fake News

It’s been a rough couple of weeks for Silicon Valley in Europe.  Hard on the heels of an embarrassing lobbying loss in the European Parliament with the Copyright Directive (aka “Article 13”), the UK Parliament issued a damning report on the failings of social media.  The title tells you a lot: Disinformation and Fake News.  Headline readers will come away from the news reporting with the impression that the report is about the UK government regulating Facebook due to the manipulation of the platform by Russian trolls using active measures.  If you read the report, even just the summary, you will see that it is the work product of the UK committee joined by representatives from eight other countries, and it is targeted at all social media platforms and “user generated content” (or “UGC”).

Unlike US-style regulation, which these companies simply ignore while paying the paltry fines, it is unlikely that Google, Facebook and others will be able to simply conduct business as usual in the UK or Europe (Brexit or no)–particularly when you see statements like the following from Tom Watson, the Labour Party’s Shadow Culture Secretary:

Labour agrees with the committee’s ultimate conclusion: the era of self-regulation for tech companies must end immediately. We need new independent regulation with a tough powers and sanctions regime to curb the worst excesses of surveillance capitalism and the forces trying to use technology to subvert our democracy.

Few individuals have shown contempt for our Parliamentary democracy in the way Mark Zuckerberg has. If one thing is uniting politicians of all colours during this difficult time for our country, it is our determination to bring him and his company into line.

Tom Watson, BASCA

Considering that the corporate bot farming techniques and the corporate comms version of Marcuse-esque messaging that Google and Facebook used against Article 13 are even more insidious than the active measures of the Russkie election manipulators who were the focus of the Parliamentary report, it’s all pretty breathtaking.

They’ll Take Us in a Rush

Corporate whack a mole is–or was–the ultimate de facto safe harbor and is at the heart of the value gap.  Two truths were obvious from the moment in 2006 when a lawyer from Google’s recently acquired YouTube told a bar association meeting in Los Angeles that rights holders could either make a deal with her for weaponized UGC on YouTube or play whack a mole using the DMCA notice and takedown: Google and their shills intended to fight every step of the way (see Ellen Seidler’s excellent take down of the take downs).

First, it was obvious that Google executives intended to enrich themselves building a business on the backs of artists and songwriters who couldn’t fight back (what I call the ennui of learned helplessness), and next that Google intended to use those grey market profits and their vast wealth from public markets in a particularly nasty way that would have made Leland Stanford proud.  Google would simply crush any opposition from any rights holder or competitor who stood up to them.  But most of all, it was obvious that UGC is the ultimate front end for the data profiling back end, which is where the real money is made.

This 2006 display of corporate molery had special resonance for me.  I spoke at the OECD’s Rome conference on digital stuff early in 2006 where Professor Terry Fisher and Google lawyer-to-be Fred Von Lohmann essentially laid out the strategy of using UGC to overwhelm the system and abuse the safe harbors.  That strategy was at the heart of their humiliating loss in the Grokster case and should be seen as implementing Grokster by other means (recall that Fred did a first-rate job of articulating the argument that carried the day in the 9th Circuit Court of Appeals but failed where it mattered, in the U.S. Supreme Court).

During a very spiffy dinner that probably cost enough to have provided fresh water to a million in South Sudan, Professor Fisher told the slightly boozed up crowd of bureaucrats how the world was going to work with UGC.  I was very likely the only one in the room who knew enough about Fisher and Von Lohmann and about Google’s tactics to really get the message.  I whispered to my dinner partner, “They intend to take us in a rush.”

And so they did.

Platforms Are Fit for Purpose but Their Purpose Isn’t Fit

The Parliament’s report on Disinformation and Fake News is a strong rejection of Silicon Valley data miners like Google, Amazon and Facebook.    (You could say a latter day Big Four, but the Big Three won’t let there be a fourth in the best traditions of the Big Four.)

Google is a thought leader among the aristocracy of Silicon Valley’s real-time data miners and subsidizes many other pundits who support its business model in a variety of ways.   It’s not surprising that Facebook followed the path that Google blazed with YouTube since Google got so rich doing it.

In many ways, Facebook is the ultimate UGC profiteer–and blissfully chose to largely ignore the moral malaise that UGC will inevitably bring with it.  Zuckerberg, Page and Brin ignored these problems because The Boys Who Wouldn’t Grow Up were making too much money–and getting away with it.  The fundamental problem is that these companies care more about enriching themselves than they care about who their users are or the content their users generate–as long as users keep generating.  It is that greed that underlies the studied lack of control designed into Google and Facebook.  It’s not that bad guys exploit a design flaw–it’s that the platforms work exactly as planned.

Nowhere is this more obvious than with the failure of Google and especially Facebook to manage their platforms to prevent bad actors from using the very tools that enriched the Silicon Valley monopolists for very odious disinformation campaigns.

Despite repeated warnings, governments have allowed these nation-state level actors to play their whack a mole game so freely that by the time democracy itself was on the line, it had become difficult to regulate the monopolists.

Until now–or so we hope.  The UK Parliament’s Select Committee on Digital, Culture, Media and Sport has rendered its final report on “Disinformation and Fake News.”  While the report nominally focuses on Facebook, lovers of democracy should welcome it for both its methods and its findings, which reach well beyond one company.

The International Grand Committee

The Select Committee’s methods are refreshing:

We invited democratically-elected representatives from eight countries to join our Committee in the UK to create an ‘International Grand Committee’, the first of its kind, to promote further cross-border co-operation in tackling the spread of disinformation, and its pernicious ability to distort, to disrupt, and to destabilise. Throughout this inquiry we have benefitted from working with other parliaments. This is continuing, with further sessions planned in 2019. This has highlighted a worldwide appetite for action to address issues similar to those that we have identified in other jurisdictions….

[A]mong the countless innocuous postings of celebrations and holiday snaps, some malicious forces use Facebook to threaten and harass others, to publish revenge porn, to disseminate hate speech and propaganda of all kinds, and to influence elections and democratic processes—much of which Facebook, and other social media companies, are either unable or unwilling to prevent. We need to apply widely-accepted democratic principles to ensure their application in the digital age.

The big tech companies must not be allowed to expand exponentially, without constraint or proper regulatory oversight. But only governments and the law are powerful enough to contain them.

Let’s hope so.  In the face of well-financed resistance by some of the biggest corporations and the most devious robber barons in commercial history since the days of the Big Four railroads, our governments and law enforcement have pretty much failed so far.  That’s how we got here and that’s how the problem evolved well past private attorney general-type remedies.

The public attorneys general need to mind the value gap.  Hopefully the European governments have the spine to stand up and show America how it’s done.


The Elusive Obelus: Streaming’s Problem With Denominators

“How did you go bankrupt?” Bill asked.
“Two ways,” Mike said. “Gradually and then suddenly.”

Ernest Hemingway, The Sun Also Rises.

No matter how much people would like to deflect it, the unvarnished per-stream rate is an ever-diminishing income stream.  Given the number of calculations involved for both sound recording and song, it is likely that the total end-to-end cost of rendering the accountings for a stream exceeds the royalty earned on that stream by any one royalty participant.  Solving this problem is the difference between a short-term stock-fueled sugar high and a long-term return of shareholder value for all concerned.  So now what?

If you’re someone who receives or calculates streaming royalties, you’re already familiar with the  problem of the ever-decreasing per-stream rate.  The Trichordist’s definitive “Streaming Price Bible” for 2018 confirms this trend yet again, but simple math explains the problem of the revenue share allocation.

Remember that the way streaming royalties are calculated in voluntary agreements (aka “direct deals”) revolves around a simple formula (Formula A):

(Payable Revenue ÷ Total Service Streams) x Your Streams = Your Share of Revenue

The quotient in parentheses, Payable Revenue ÷ Total Service Streams, is the per-stream rate.

Which may also be expressed as Formula B:

Payable Revenue x (Your Streams ÷ Total Service Streams) = Your share of revenue

(Formula A and B are also known as “the big pool” in the user-centric or Ethical Pool models.)

Here’s the trick–it’s in the relative rates of increase over time of the numerator and the denominator.  If you focus on any single calculation you won’t see the problem.  You have to calculate the rate of change over time.  Simply put, if the numerator in either Formula A or Formula B increases at a lower rate than the denominator, then the quotient, or the result of the division, will always decline as long as those conditions are met.  That’s why the Streaming Price Bible shows a declining per-stream rate–a contrarian fact amid the hoorah from streaming boosters that sticks in the craw.
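
To make the arithmetic concrete, here is a minimal sketch in Python. Every figure is made up purely to illustrate the mechanism; none of it comes from any service’s actual accountings.

```python
# Illustrative only: made-up figures, not any service's actual accountings.
periods = [
    # (payable_revenue_usd, total_service_streams)
    (100_000_000, 20_000_000_000),  # revenue grows ~10% per period...
    (110_000_000, 26_000_000_000),  # ...while streams grow ~30% per period
    (121_000_000, 33_800_000_000),
    (133_100_000, 43_940_000_000),
]

for i, (revenue, streams) in enumerate(periods, start=1):
    per_stream_rate = revenue / streams  # Formula A's quotient
    print(f"Period {i}: per-stream rate = ${per_stream_rate:.6f}")

# Because the denominator (streams) grows faster than the numerator (revenue),
# the quotient falls every period even though payable revenue keeps rising.
```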

Services make these accounting calculations monthly for the most part, and they are calculated a bit differently depending on the service.  This is why the Streaming Price Bible has different rates for different services, rates that vary depending on the terms of the contract and also the amount of “Payable Revenue” that the service attributes to the particular sound recordings.

The quotient will also vary depending on the copyright owner’s deal.  If you add downside protection elements such as contractual per stream or per subscriber minimums, then you can cushion the decline.
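
As a rough sketch of how such a floor changes the calculation, here is one way a greater-of per-stream minimum might work. The structure and the numbers are assumptions for illustration, not the terms of any particular deal.

```python
def payable_with_floor(revenue_share_amount: float,
                       your_streams: int,
                       per_stream_minimum: float) -> float:
    """Pay the greater of the revenue-share allocation or a contractual
    per-stream minimum (a typical downside protection structure)."""
    floor = your_streams * per_stream_minimum
    return max(revenue_share_amount, floor)

# Hypothetical month: the revenue share works out to $2,800, but a
# $0.0009 per-stream floor on 4,000,000 streams guarantees $3,600.
print(payable_with_floor(2_800.0, 4_000_000, 0.0009))  # -> 3600.0
```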

This is also true of non-recoupable payments (such as direct payments that are deemed to be recoupable but not returnable, or “breakage”).  Nonrecoupable payments are just another form of nominal royalty payable to the copyright owner, and increase the overall payout.  And of course, the biggest nonrecoupable payment is stock which sometimes pays off as we saw with Spotify.  These payments may or may not be shared with the artist.  (See the WIN Fair Digital Deals Pledge.)

So each of the elements of both Formula A and Formula B is a function of other calculations. We’re not going to dive into those other elements too deeply in this post–but we will note that there are some different elements to the formulas depending on the bargaining power of the rights owner, in this case the owner of sound recordings.

So how is it that the per-stream rate declines over time in the Streaming Price Bible?

Putting the Demon in the Denominator

Back to Formula B, you’ll note that the function “Your Streams ÷ Total Service Streams” looks a lot like a market share allocation.  In fact, if the relevant market is limited to the service calculating the revenue share allocation, it is a market share allocation of service revenue by another name.  When you consider that the customary method of calculating streaming royalties across all services is a similar version of Formula B, it may as well be an allocation of the total market on a market share basis.
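
A minimal sketch of the point, with hypothetical catalogs and stream counts: applying Formula B to every rights holder on a service is simply a pro rata, market-share split of the payable revenue.

```python
payable_revenue = 1_000_000.00  # hypothetical monthly payable revenue

streams_by_rights_holder = {    # hypothetical catalogs
    "Major A": 600_000_000,
    "Major B": 300_000_000,
    "Indie C":  90_000_000,
    "DIY D":    10_000_000,
}
total_streams = sum(streams_by_rights_holder.values())

# Formula B applied to each rights holder.
allocations = {
    name: payable_revenue * (streams / total_streams)
    for name, streams in streams_by_rights_holder.items()
}

for name, amount in allocations.items():
    print(f"{name}: ${amount:,.2f}")

# The shares add back up to the full payable revenue: a market-share split
# of the service's revenue, not a price set by the rights holder.
print(f"Total: ${sum(allocations.values()):,.2f}")  # -> $1,000,000.00
```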

Note that this is very different from setting a wholesale price for your goods that implies a retail price.  A wholesale price is a function of what you think a consumer would or should pay.  When a service agrees to a minimum per stream or per subscriber rate, they are essentially accepting a price term that behaves like a wholesale price.

For most artists and indie labels, the price is set by your market share of the subscription fees or ad rates that the service thinks the market will bear based on the service’s business goals, not based on your pricing decision.

Why is this important?  A cynic might say it’s because Internet companies are in the free lunch crowd–they would give everything away for free since their inflated salaries and sky-high rents are paid by venture capitalists who don’t understand a thing about breaking artists and investing in talent.  You know, the kind of people who would give Daniel Ek a million-dollar bonus when he hadn’t met his performance targets, had stiffed songwriters for years and had gotten the company embroiled in multimillion-dollar lawsuits.  But he had met the only performance target that mattered, which was to put some cosmetics on that porker and push it out the door into a public stock offering.  (SPOT F-1 at p. 133: “In February 2018, our board of directors determined to pay Mr. Ek the full $1,000,000 bonus based on the Company’s 2017 performance though certain performance goals were not achieved…”)

But long-term, it’s important because one way that royalties will rise is if the service can only acquire its only product at a higher price.  Or not.  The other way that royalties will rise is if services are required to pay a per-stream rate that is higher than the revenue share rate.  How that increase is passed to the consumer is up to them.  Maybe a move from World Trade Center to Poughkeepsie would help.

The Streaming Price Bible is based on revenue for an indie label that did not have the massive hits we see on Spotify.  In this sense, it is the unvarnished reality of streaming without the negotiated downside protection goodies, unrecoupable or nonreturnable payments, and of course shares of stock.  While some may say the Bible lacks hits, that’s kind of the point–hits mask a thousand sins.  Ask any label accountant.

Will Consumption Eat Your Free Lunch?

Let’s say again: The simple explanation for the longitudinal decline of streaming royalties measured by the Streaming Price Bible is that the rate of change across accounting periods in the “Payable Revenue” must be greater than the rate of change in the total number of streams in order for the per-stream rate to increase–otherwise the per-stream rate will always decrease.  Another way to think of it is that revenue has to increase faster than consumption, or consumption will eat your lunch.
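
For the algebraically inclined, here is the same condition in symbols, writing R for Payable Revenue, S for Total Service Streams and r for the per-stream rate in accounting period t (notation introduced here only for this derivation):

```latex
r_t = \frac{R_t}{S_t},
\qquad
\frac{r_{t+1}}{r_t} = \frac{R_{t+1}/R_t}{S_{t+1}/S_t} < 1
\iff
\frac{R_{t+1}}{R_t} < \frac{S_{t+1}}{S_t}
% The per-stream rate declines exactly when payable revenue grows more
% slowly than total service streams.
```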

What if you left the formula the same and just increased the revenue being allocated?  Services will probably resist that move.  After all, when artists complain about their per-stream rate, the services often answer that the problem is not with them, it is with the artist’s labels because the services pay hundreds of millions to the labels.

We don’t really have much meaningful control over what goes in the monthly payable revenue number (i.e., the mathematical “dividend” or numerator).  What kinds of revenue should be included?  Here are a few:

–all advertising revenue from all sources
–e-commerce transactions
–bounties or referral fees, including  recoupable or non-refundable guarantees
–sponsorships
–subscription income
–traffic or tariff charges paid by telcos
–revenue from the sale of data

Services will typically deduct “small off the tops,” which would include the following (a rough sketch of the resulting payable revenue calculation appears after this list):
–VAT or sales tax
–ad commissions paid to unaffiliated third parties (usually subject to a cap)
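
Putting the pieces together, here is a minimal sketch of how a monthly payable revenue figure might be assembled from the categories listed above. Every number, the 15% commission rate and the cap are hypothetical; real deals define these terms in the contract.

```python
# Hypothetical monthly gross revenue by category (mirrors the lists above).
gross_revenue = {
    "advertising":        4_000_000.00,
    "e_commerce":           250_000.00,
    "bounties_referrals":   100_000.00,
    "sponsorships":         500_000.00,
    "subscriptions":     12_000_000.00,
    "telco_tariffs":        750_000.00,
    "data_sales":           400_000.00,
}

# "Small off the tops" deducted before the revenue share is applied.
vat_and_sales_tax = 1_200_000.00
ad_commissions = min(gross_revenue["advertising"] * 0.15, 500_000.00)  # assumed 15%, capped

payable_revenue = sum(gross_revenue.values()) - vat_and_sales_tax - ad_commissions
print(f"Payable revenue this month: ${payable_revenue:,.2f}")
```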

Indie labels and independent artists may not have the leverage to negotiate some of these revenue elements, starting with revenue from the sale of data.  Other elements of the revenue calculation for indie labels and artists will also likely not include the downside protections, subscriber-target top-up fees and the like.

And of course the biggest difference is that indie labels (at least those outside the Merlin group, which may) typically do not get nonreturnable advances, nonrecoupable payments, or stock.

Is That All There Is?

Why should we care about all this?  There is a story told of negotiations to settle a lawsuit against a well-known pirate site.  One of the venture capitalists backing the pirates told one of the label negotiators that he could make them all richer through an IPO than through any settlement they’d ever be able to negotiate.

The label executive asked, “Let’s say we did that, but then what happens?  You say we should adapt, but you’re still destroying the industry ecosystem so that there’s nothing left to adapt to.  The most we could make from an IPO would cover our turnover for a year at best.  And we would be dependent on your success, not our artists’ success.”

Then what?