Sunday, March 20, 2011

Mobile Handset OS Copyright Battles

Several years ago, IBM contested SCO's claims that its copyrighted Unix code had been copied by IBM into Linux. It turned out that some 326 lines of code, most of which were part of header files, were copied. It was held that nothing copyrightable had been copied, because the header files contained no executable code; most of the copied lines were comments. If these were the only lines copied out of 700,000 lines in Linux, there was insufficient similarity between Unix and Linux to sustain a copyright-violation charge. Moreover, it transpired that Novell, not SCO, owned the rights to Unix, and the case fizzled out in 2007. Nowadays, the focus of major IPR battles has shifted from PC and desktop/laptop operating systems to mobile handset software platforms.

Mobile platforms began with the Symbian platform, an open source software stack pioneered, maintained and made popular by Nokia along with a host of other mobile handset makers. The Symbian OS runs exclusively on ARM processors. The Symbian platform was created by merging and integrating software assets contributed by Nokia, NTT DoCoMo, Sony Ericsson and Symbian Ltd. Over 385 million handsets had shipped with this OS by mid-2010. Since then, however, Nokia and Symbian seem to have lost their way. This was clear from the recent Nokia announcement that it would adopt the Windows Phone 7 platform for its future handsets, which basically means it is slowly ditching Symbian. Unfortunately, this seems to be an alliance of possible losers, though one should never underestimate the advantage of the deep pockets both partners have, and their common interest in building and owning standards.

In the meanwhile, there have been several other handset software alliances. Now, the game has become less about the OS itself, and more about the “ecosystem” of apps and app developers. A key facilitator of such an ecosystem is a friendly license that allows proprietary derivative software to be created from underlying open source software stacks.

Some time back, the Oracle-Google fight flared up, alleging that Sun-Java code had found its way into the Android Mobile OS. As is common, Oracle waited till it became apparent that Android was going to be a big thing; and then it raised the copyright infringement issue. This controversy is widely expected to be concluded very soon by a licensing deal between the two giants. This is on similar lines as the fight between i4i and Microsoft (see this earlier blog entry).

More recently, there have been reports (see this and this) hinting that Google Android app developers run a risk: if Google turns out to have violated the GNU General Public License (GPLv2) in creating Android's Bionic library, then those who develop apps against the Bionic library may suffer the same fate, viz., they will have to expose and make available their source code and become subject to GPLv2.

Let us try and understand what the controversy is all about.

First, let us explain what Google has done. It has taken a part of the Linux kernel header files (which contain macros, inline functions and comments, but no executable code) and stripped them of all comments and white space, using automated scripts. By doing so, Google says, it has stripped the header files of anything that is copyrightable, and has also eliminated many issues that lead to compilation failure. It contends that this creates "clean" header files that are no longer subject to the GPLv2 license, because they only contain a few types, macros, and inline functions; in its words, "Bionic comes with a set of 'clean' Linux kernel headers that can safely be included by userland applications and libraries without fear of hideous conflicts." The result is a C library that is used by all app developers needing to access core functions of the Linux OS.
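The kind of automated cleaning described above can be sketched in a few lines. This is my own illustrative script, not Google's actual tool, and it is deliberately naive -- comment markers inside string literals, for instance, would confuse it:

```python
import re

def clean_header(source: str) -> str:
    """Strip C comments and blank lines from a header, keeping
    declarations, macros and inline functions intact."""
    # Remove /* ... */ block comments (non-greedy, across lines).
    source = re.sub(r"/\*.*?\*/", "", source, flags=re.DOTALL)
    # Remove // line comments.
    source = re.sub(r"//[^\n]*", "", source)
    # Drop lines that are now empty or whitespace-only.
    lines = [line.rstrip() for line in source.splitlines()]
    return "\n".join(line for line in lines if line.strip())

# A hypothetical header fragment, for illustration only.
header = """\
/* A licence notice would normally appear here. */
#define PAGE_SIZE 4096   /* bytes per page */

// an inline helper
static inline int page_count(int bytes) {
    return (bytes + PAGE_SIZE - 1) / PAGE_SIZE;
}
"""

# Only the #define and the inline function survive the cleaning.
print(clean_header(header))
```

Note that the macro and the inline function pass through untouched -- which is precisely why the critics, discussed below, argue that the cleaned files may still contain copyrightable expression.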

Those who say that Google is wrong essentially insist that the headers are indeed copyrightable, because macros and inline functions (which are not stripped out by Google) embody a considerable amount of creativity and original expression; stripping out comments and empty lines does not make the header files any less copyrightable. If a single header file is copyrightable, it can hardly be said that a collection of hundreds of such files is not. There is further copyrightable material in the structure, design and organisation of the collection as a whole, beyond what is copyrightable in each individual file. Therefore, the header files constitute copyrightable elements (under US law) of the Linux distribution, and hence are also subject to the GPLv2 license.

Those who say that Google is right rely on the court's reasoning in SCO v. IBM, discussed in the first paragraph, to hold that the header files, not being executable, and being stripped of any original content like comments, are not copyrightable.

If Google is wrong, at worst, all Android app developers (part of the Open Handset Alliance) will have to make their source code available, which will blunt their competitive edge. If Google is right, it will have found a way to escape the rigours of the GPLv2 license, and essentially to "privatize" Linux. Currently, Google licenses Android's Bionic under the Apache Software License, which is seen as much more business-friendly than the GPLv2, being a more permissive license conducive to commercial development and proprietary redistribution. "Permissive licenses like the ASL and BSD license are preferred by many companies because such licenses make it possible to use open-source software code without having to turn proprietary enhancements back over to the open source software community", writes Ryan Paul on the Ars Technica website. However, as pointed out here, neither the software nor the licensing is so simple. Android is a complex open source project made up of more than 165 components, 80,000 files, and 2 GB of code under 19 different licenses.

The Google-driven Open Handset Alliance is by no means the only Linux-inspired coalition of mobile handset manufacturers. Another example is the LiMo Foundation, whose mobile OS extensively uses open source technologies, though some of its top-level APIs are said to be proprietary. These proprietary components included in the LiMo Core will, however, be available to third-party developers under a royalty-free license that covers both patents and copyrights. Among the service-provider/handset-maker pairings LiMo boasts are using its platform are NEC/NTT DoCoMo, Samsung/Vodafone, and NTT DoCoMo/Panasonic. Unlike several other alliances driven by a single company, the LiMo Foundation boasts an independent governance structure reflecting the inputs and contributions of multiple industry stakeholders.

OpenMoko is another such mobile alliance with an open source software stack, whose software and license are said to be very business-friendly. Among its distributions is GameRunner, an Openmoko Linux distribution that aims to convert the Freerunner (a touch-screen smartphone designed to run OpenMoko software) into a Linux-based handheld game console.

Friday, March 26, 2010

IPR in the IPL: Further updates

This post updates the following posts:

The third edition of the IPL has brought further moolah to all -- but most of all, to the BCCI itself. Consider: 
  •  Two more team franchises (Pune, Kochi) were auctioned, and fetched a price of $703 Mn (Rs.3200 crores) -- just a shade below what all the 8 teams fetched two years earlier.
  •  The main, domestic TV rights have been auctioned for about Rs.8,700 crores.
  • Mobile application rights for the next 8 years have also been sold this year for an undisclosed sum, to DCI Mobile Studios (a division of Dot Com Infoway Limited), in conjunction with Sigma Ventures of Singapore.
  • This year, BCCI sold broadcast rights to Youtube (Google) and to Britain's ITV.
  • Even the right to run and operate the official website of the tournament has been sold, with a minimum guarantee negotiated at US $50 million over 10 years.
  • In Britain, 0.5 million people are reported to have watched the IPL3 opening game alone, while 42 million people watched in India.
  • That it is an unparalleled marketing bonanza becomes clear when you see that women are estimated to comprise 38% of viewers, and a full 45% are between the ages of 15 and 35, the "spenders".
  • Some teams that have been promoted incessantly (like Shah Rukh Khan's KKR) have already become profitable, while others are still making a loss, smarting from the sudden shifting of IPL2 to South Africa for security reasons, leading to inability to earn revenues from ticket sales of home stadia. 
  • However, team owners are happy because they have been more than compensated in other ways -- the valuation of the IPL, and consequently of each team franchise, has skyrocketed. The IPL itself is valued at $4.13 Bn, and there are reports of team owners who bought a franchise for under Rs.300 crores looking to sell their stake for Rs.920 crores after just two years. Also, at least one team finds team ownership and other sponsorships an ideal vehicle for surrogate advertising of liquor brands (liquor cannot be advertised directly).

Thursday, December 24, 2009

Why is i4i's patent important?

i4i's patent, which Microsoft has been found to infringe in Word 2007, is about a generational leap in the capability of computers to process data.
  • HTML as the first generation. If you wanted the author's name to be seen in bold, you used a descriptive tag thus: <b>bold text</b>
  • SGML was the next generation: it operated with a special complement of descriptive "verbs" or "tags", and software that could process and interpret them efficiently. The advantage of such an approach was that different stylesheets could depict the same text differently, promoting re-usability of data. The problem lay in the restricted number of tags.
  • XML broke this limitation of SGML; now, one could create one's own tags, and browsers could render a document so long as it was "well-formed", i.e. close-tags followed open-tags predictably within the document, and there was no open-tag without a corresponding close-tag, or vice versa. Further, condensed and sophisticated logic could be imposed on the document structure through rules of logic and structure embedded in Document Type Definitions (DTDs) or Schemas. This provided another breakthrough in the range of applications -- no longer did one need standard ERP software in order to exchange data; XML encoders and decoders did the job, and organisations could merely exchange XML files representing transactional data, independent of database software. Thus markup languages made data interchange possible easily and cheaply. Many other applications were developed that made XML a development of nearly revolutionary proportions.
  • However, with all these developments, tags (which are commands to the computer) were still interspersed with the data. This meant that when reading the data stream, the computer first had to apply logic to determine whether each character it read was part of the data or part of a command. This slowed down the computer's ability to read and process a document or an object. While this may not be apparent at the scale of data most of us deal with, where there are mountains of data to process this is a serious time-and-efficiency robber.
  • This is where the elegant concept of i4i's patent comes in. If commands are interpreted independently of the content, the computer can read all the content in one go and process it by applying the commands in serial order. In other words, if all the commands in a data object ("file") were found in one place and all content in another, the computer would no longer need to evaluate every character to determine whether it was part of a command or of the content. This affords a huge, generational efficiency leap: for the same computing power, a lot more data can be crunched in much less time. In effect, this could make computing power cheaper by raising the efficiency with which computers process information.
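The well-formedness rule described in the XML bullet above can be sketched as a simple stack check. This is a toy model of my own (real parsers also handle attributes, self-closing tags and much more): a close-tag must match the most recently opened, still-unclosed tag, and nothing may be left unclosed at the end.

```python
def well_formed(tokens):
    """Check the 'well-formed' rule: every close-tag matches the most
    recent unclosed open-tag, and nothing is left unclosed at the end."""
    stack = []
    for tok in tokens:
        if tok.startswith("</"):          # close-tag: must match top of stack
            name = tok[2:-1]
            if not stack or stack.pop() != name:
                return False
        elif tok.startswith("<"):         # open-tag: push it
            stack.append(tok[1:-1])
        # anything else is character data; ignore it
    return not stack

print(well_formed(["<author>", "<name>", "Parton", "</name>", "</author>"]))  # True
print(well_formed(["<author>", "<name>", "Parton", "</author>", "</name>"]))  # False
```

The second example fails precisely because the close-tags do not follow the open-tags predictably, which is what the bullet above means by "well-formed".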

Patent Infringement Case against Word 2007


On reading the patent No. 5787449 applied for on 2 June, 1994 and granted to i4i on 28 July, 1998, for a "Method and system for manipulating the architecture and the content of a document separately from each other",  and some MSDN literature relating to the "Custom XML" claimed to be a Microsoft invention, I observed the following.

  • The patent dealt with a method of keeping raw, unstructured data separate from its formatting or presentation-related information. This is different from what is commonly understood as XML, because an XML file's content is structured, not raw.
  • The patent application clearly differentiates the method from earlier standards including TROFF, RTF and SGML by showing that what is patented has no codes embedded in the content; instead it has a content part and a metacode-map part, stored separately. One could have multiple metacode maps acting upon the same content, which could therefore be literally anything. Thus, for consistent content that rarely changes, multiple re-use of the content with different metacode maps, each serving a different purpose, becomes possible.
  • This is uncomfortably close to what Microsoft calls "Custom XML" on its MSDN library site. Indeed, way back in 2005, one of Microsoft's lead programmers blogged on an MSDN blog about the new "Custom XML" -- and if you read that, it becomes quite clear even to a relative layman that Microsoft would put an "envelope" around any data (a Word document, a spreadsheet or anything else) to form a composite object, consisting of the envelope and the data placed in what is called the "XML Data Store". The resultant object, which you and I know as the MS Office 2007 document format, is called the "Office Open XML package". The advantages of this are expounded in the same blog entry. Brian Jones, the lead programmer, admits (gushes, actually) here (in 2005, remember!) that for Microsoft it is a new feature.
  • Both Microsoft's Custom XML and i4i's patented method are not really about XML. Using the method to store structured, formatted XML content is just a subset of what the system can do; it can store binary (or raw) data just as easily as structured text content.
  • Brian Jones' gushing about a "new" feature that had already been patented for seven years is reminiscent of the scathing, withering review of Bill Gates' book, Business @ the Speed of Thought -- that Gates predicts the past. Much worse, while Gates merely became an object of intellectual scorn to the reviewer, what Brian Jones and his ilk have done is drive Microsoft into a patent-infringement hole costing at least $290 million -- and that won't look pretty from inside Microsoft.
  • Brian Jones or others at Microsoft may have re-invented the wheel, but they can hardly claim ignorance of the i4i patent, given that Microsoft probably has one of the largest legal departments of any company in the world, and every product presumably undergoes IPR-infringement vetting before going to market.
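The separation the patent describes -- a content part kept in one place, a metacode map stored apart from it -- can be illustrated with a small sketch. This is my own toy model of the idea, not the patented implementation: the content carries no markup at all, and each map is just a list of (offset, command) pairs that can be merged back in when needed.

```python
def render(content, metacode_map):
    """Merge a metacode map back into a raw content stream."""
    out, last = [], 0
    for offset, code in sorted(metacode_map):
        out.append(content[last:offset])  # copy content up to this offset
        out.append(code)                  # then emit the command
        last = offset
    out.append(content[last:])            # remaining content, read in one go
    return "".join(out)

# Raw content: no commands embedded anywhere in it.
content = "IPR in Mobile OSes"

# Metacode maps: (offset, command) pairs stored apart from the content.
title_map = [(0, "<title>"), (len(content), "</title>")]
bold_map = [(0, "<b>"), (len(content), "</b>")]

# The same content can be re-used with different maps.
print(render(content, title_map))  # <title>IPR in Mobile OSes</title>
print(render(content, bold_map))   # <b>IPR in Mobile OSes</b>
```

Because the content is never touched, a reader can consume it wholesale without testing each character, and multiple maps can act on the same content -- the two properties the bullets above attribute to the patented method.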
To conclude, I think the decision is fair: the concept was clearly patented, and the i4i patent was clearly infringed, albeit under the new names of "XML Data Store" and "Custom XML".

Thursday, December 3, 2009

The US was also a major IPR pirate not so long ago!

The US just loves to paint developing countries like China and India in dark colours when it comes to respecting IPR, but here are a couple of articles written during the second half of the 19th century, when England was relatively more prolific in arts and letters than the United States. It transpires that the United States then was not much different from what it alleges China is today. In other words, the US's own record in this regard is hardly impeccable.

In those days, there was no established international copyright code. Therefore, payments of royalties and recognition for foreign authors were not legally enforceable but were based on honesty and "courtesy of the trade".

In 1867, one James Parton wrote, "For forty years or more we have all been buying our books and reviews at thieves' prices. ... Can any one suppose that the proprietors like to see Blackwood and half a dozen other British magazines sold all over the country at a little more than the cost of paper and printing?" He chronicles several instances of authors unable to cash in on the success of their works, and makes out a cogent case for an international copyright.

Then, in 1879, Arthur Sedgwick wrote, "... piracy still flourishes as a profitable branch of trade. ... The attitude of the United States on the subject of copyright is more remarkable than that of any other modern country. ... It has ... studiously fostered international piracy, and refused to foreigners the benefits of its copyright law."

James Fallows, in a more recent article written in Dec 1993, suggests that cheating and cutting corners to get ahead, and then, once strong, advocating set rules of fair play and chiding other powers for failing to abide by them,  was a standard pattern by which developing nations typically bolstered their international economic standing. We can see this pattern very regularly in the big international debates of the day -- be it agricultural subsidies, or climate change initiatives, or IPR.

These writings, both old and relatively recent, lay bare the fact that, notwithstanding the high moral ground they adopt in multilateral negotiations today, the developed nations were themselves not much different from the targets of their ire only a century and a quarter ago.

Wednesday, November 25, 2009

Act against ACTA Secrecy

The governments of the United States, the 27 member countries of the European Union, Japan, Switzerland, Australia, New Zealand, South Korea, Canada, and Mexico are negotiating a trade agreement named the Anti-Counterfeiting Trade Agreement (ACTA). Despite the name, the agreement is designed to address not only counterfeiting but a wide range of intellectual property enforcement issues, including civil and criminal enforcement, IPR in the digital environment, and more. Thus, ACTA is not just a simple trade agreement but something with much wider ramifications.

In most multi-lateral negotiations, sunlight is considered the best disinfectant, mainly because secrecy cannot really be maintained over a long period. In this case, however, the specific details of ACTA have largely been kept secret. The United States Trade Representative (USTR) has refused to release even the agenda and lists of participants for the June 2008 ACTA negotiating sessions. For over two years, the U.S. government has claimed the negotiations can be shielded from disclosure under laws protecting the national security of the United States. Senators Bernie Sanders and Sherrod Brown have written to the USTR, asking that the ACTA text be made public. Then, in response to sustained pressure for openness, the Obama administration began inviting lobbyists, corporate law firms and big companies to see the "national security" secret documents under non-disclosure agreements that contractually prohibit public criticism or discussion of the ACTA text.

I think this makes the secrecy even worse -- the fact that only interested pressure groups are being allowed to see the text, and not the general public, makes it seem that a conspiracy is being cooked up against the interests of those kept in the dark (WIPO, developing countries, NGOs). More specifically, there is no evidence so far that ACTA contains the safeguards embodied in Articles 1, 6, 7, 8, 40 and 44.2 of TRIPS, which together protect the public interest. Further, the very fact that there is a separate enforcement mechanism (not really necessary, as there is a well-negotiated mechanism in TRIPS Articles 41, 44.1, 45, 46, 47, 50, and 61) gives rise to fears that the new provisions may be more restrictive or undemocratic in their impact.

The Australian Government and the Canadian Government defended the secrecy in identically worded statements, thus: "A variety of groups have shown their interest in getting more information on the substance of the negotiations and have requested that the draft text be disclosed. However, it is accepted practice during trade negotiations among sovereign states to not share negotiating texts with the public at large, particularly at earlier stages of the negotiation."

I think this secrecy is uncomfortable, and unjustifiable. Until ACTA, nearly all global negotiations on multilateral intellectual property norms were comparatively much more open and transparent. See these documents 1 2 3 that lay down the extent of transparency in other multi-lateral negotiations.

However, maintaining secrecy is very difficult. There have been leaks of the ACTA text, which suggest that the concerns over the lack of transparency are justified: the negotiators seem to be bending to copyright pressure groups by imposing copyright-industry demands on the global Internet, demands that would place policing and infringement-protection responsibilities on ISPs in the signatory countries. Worse may be in store -- these leaks represent only a minuscule portion of the text, and what else lurks beneath is a real concern.

Why should India bother about ACTA?
As this commentator puts it, "Because ACTA is intended to create new global international IP enforcement standards, including these provisions will allow US negotiators to achieve what they have not been able to do to date – ensuring that the US's overbroad implementation of the WIPO Internet Treaty TPM obligations becomes the global standard."

India must be concerned about anything that might be pushed down its throat without its consultation or involvement. That's why.

Tuesday, November 24, 2009

Strong IPR regimes counterproductive for technology transfers

Technology transfers of climate-change-related technologies to developing countries are said to be impossible due to their weak IPR regimes. This oft-expressed notion has been called into serious question by a 64-page discussion paper titled Emerging Asia contribution on issues of technology for Copenhagen, jointly authored by representatives of research institutes from 5 countries. They evaluated the domestic status and transfer of 3 key mitigation technologies, viz. clean coal, solar power and biofuels, in China, India, Indonesia, Malaysia and Thailand.

They point out that Malaysia and Indonesia have strong IPR regimes, yet have not benefited from transfers of these clean technologies.

They argue that strong IPR regimes may even hinder developing countries' access to technology: where patents are honoured, since most are held by foreign companies, local research is stifled and adaptation of technology to local needs is prevented.

These are strong arguments indeed, and their eventual recommendation is even more startling: TRIPS allows individual countries to override patents in a national emergency, so it could be worthwhile to declare climate change a national emergency and climate change mitigation a public good.

This report should set the cat among the pigeons if any of the developing countries were to follow its recommendation, and its cogent, data-backed arguments will surely be a topic of heavy discussion at Copenhagen later this year.


Monday, November 23, 2009

File Sharing and Copyrights

A new chapter has been written in the File Sharing/ Peer-to-peer networking legality saga.

Pirate Bay, one of the most popular BitTorrent trackers, which had been going strong for 5 years, has been forced to close down after a Swedish district court found it guilty of assisting copyright infringement. Its defence was two-fold: (a) it never really hosted any of the files, but only tracked where they were hosted; (b) even Google and other search engines provide direct access to illegal .torrent files, so there was nothing specially illegal about what it did -- namely, maintain a sophisticated tracker that leads users to where the .torrent files are hosted.

In the aftermath of the verdict, several private BitTorrent trackers including Nordicbits, Powerbits, Piratebits, MP3nerds and Wolfbits, have closed down in what could be the greatest voluntary tracker collapse ever.

The MPAA had claimed damages of $15 Mn against Pirate Bay but the awarded damages were much lower, though substantial. In a parallel suit, the MPAA has won $110 Mn from TorrentSpy, another .torrent tracker site, in a US federal court.

I am now waiting to see whether the MPAA, having tasted blood, also goes after Google.

Proactively chasing trademark and brand merchandise rights infringers

Private detectives may be used to carry out checks on violation of Intellectual Property Rights (IPRs) on official trademarks and brand merchandise of the Commonwealth Games (CWG) in Delhi next year to avoid monetary loss to the tune of crores of rupees, according to a report in the Business Standard.

This is good news indeed.

Sunday, November 8, 2009

Want to use the Mahatma's pictures? Pay royalty to a German!


A canny German named Peter Ruhe has allegedly made a career of collecting Gandhi memorabilia; he has already amassed over 12,000 original pictures of the Mahatma and other memorabilia, and intends to auction them to the highest bidders.

Now, he has gone a step further: he is claiming royalties for the use of the Mahatma's likeness in Narayan Desai's book, My Life is My Message. He has sold a few items at auction for Rs.8 crores (bought by Vijay Mallya), and also recently offered some photographs to Sabarmati Ashram for Rs.5.5 crores. He claims to be only an agent for the copyright holders in making this royalty claim.

One wonders what the Mahatma would have said about international laws that allow such a situation.