Archive for October 2010

Patent Marking Pitfalls


A patentee’s rights to recovery in an infringement action are substantially enhanced if their patented article is properly marked.  The relevant statutory language, found in 35 U.S.C. § 287(a), states: “Patentees . . . may give notice to the public that the same is patented, either by fixing thereon the word ‘patent’ . . . together with the number of the patent, or . . . by fixing . . . to the package . . . a label containing a like notice.”  The corollary to that principle is that an unpatented product must not be marked in that fashion.  To that end, 35 U.S.C. § 292(a) states: “Whoever marks upon . . . any unpatented article the word ‘patent’ or any word or number importing the same is patented . . . [s]hall be fined not more than $500 for every such offense.”  The intent here is to prevent manufacturers from gaining an unfair competitive advantage by falsely claiming their products are protected by patents.

The prohibition on marking unpatented articles presents a potential trap for patentees.  As a practical matter, many patentees are prone to forget to remove the marking when their patent expires.  Further, a measly $500 fine does not add much incentive to remember.  But that situation changed recently.  In The Forest Grp., Inc. v. Bon Tool Co., decided December 28, 2009, the Federal Circuit held that false marking penalties under 35 U.S.C. § 292 are $500 per article sold rather than $500 per product line, as had been the practice for many years.  As you can imagine, at $500 per article sold, total penalties for mass-produced products can reach into the millions of dollars very quickly.
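The arithmetic behind that shift is simple.  A minimal sketch, using hypothetical sales figures chosen only for illustration, shows how the per-article reading inflates exposure:

```python
# Hypothetical figures illustrating the two readings of the $500 cap
# in 35 U.S.C. § 292. The sales volume is invented for the example.
FINE_CAP = 500            # statutory maximum per "offense"

articles_sold = 100_000   # assumed sales of a falsely marked product
product_lines = 1         # the single falsely marked product line

# Pre-Forest Group reading: one offense per product line.
per_line_exposure = FINE_CAP * product_lines

# Post-Forest Group reading: one offense per article sold.
per_article_exposure = FINE_CAP * articles_sold

print(f"Per product line: ${per_line_exposure:,}")      # $500
print(f"Per article:      ${per_article_exposure:,}")   # $50,000,000
```

At even modest production volumes, the per-article reading turns a nominal fine into bet-the-company exposure, which explains the rush to the courthouse described below.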

One example of the dramatic effect of the court’s new interpretation regarding false marking penalties was recently reported in the Detroit Free Press.  Curiously, plaintiffs in false marking lawsuits do not need to prove damages in order to prevail.  That fact, coupled with the potential for huge monetary windfalls as described above, has resulted in a rush to the courthouse.

In response to this situation, a bill was introduced in Congress last month.  The bill seeks to return to the good old days when the fine for false marking was capped at $500 per product line.  The bill has been referred to the House Judiciary Committee.  But for now, patentees had best be on alert to remove patent markings as soon as their patents expire.

October 31st, 2010 at 5:33 pm

Unenforceable Patents that Pack a Punch: Sounds Controversial!


On October 6, 2010, the Federal Circuit released a decision holding that unenforceable patents can be the cause of a justiciable controversy under Article III of the United States Constitution if the patents block a generic pharmaceutical manufacturer’s entry into the market.  The decision overturned the United States District Court for the District of New Jersey.  Teva Pharmaceuticals USA, Inc. v. Eisai Co., Ltd., 08cv2344, 2009 U.S. Dist. LEXIS 82102 (D.N.J. September 9, 2009).

Eisai holds the approved New Drug Application (“NDA”) for donepezil, which is marketed as Aricept®.  Eisai has five patents listed in the Orange Book regarding Aricept® and its use.  Ranbaxy filed an Abbreviated New Drug Application (“ANDA”) for a generic version of donepezil in 2003 and was the first generic manufacturer to attempt entering this market.  In this effort, Ranbaxy submitted four Paragraph IV certificates.  By filing the Paragraph IV certificates, Ranbaxy indicated its belief that Eisai’s four challenged patents are either invalid or will not be infringed by the generic’s entry into the market.  Additionally, as the first to file Paragraph IV certificates, Ranbaxy is entitled to 180 days of market exclusivity, in which it will face no other generic competition, once its ANDA is approved by the Food and Drug Administration (“FDA”).  The period of exclusivity begins when Ranbaxy begins commercially marketing or a court issues a judgment holding the relevant Eisai patents – those implicated by the Paragraph IV certificates – invalid or not infringed.

Teva then filed two ANDAs for a generic version of donepezil, both accompanied by Paragraph IV certificates.  Seeking approval for an ANDA, Teva commenced a declaratory judgment action against Eisai, seeking a judgment that none of Eisai’s four relevant patents would be infringed by the approved ANDA.  Seems ordinary, but there is a twist.  Eisai claimed that the district court lacked subject matter jurisdiction because it had previously signed a covenant not to sue with Teva regarding the relevant patents.  Teva responded that it did suffer an injury because the four relevant patents are listed in the Orange Book.  Thus, Teva cannot receive FDA approval for its ANDA until Ranbaxy’s period of exclusivity is completed.  However, that period will not start until Ranbaxy begins marketing or the Eisai patents are held invalid or not infringed.  So, Teva faces being blocked from market entry by patents that cannot be enforced against it.

The Federal Circuit faced the question whether a justiciable controversy existed in this case by examining whether there was an actual controversy.  The court noted that “regardless of whether Eisai could bring an infringement action with respect to the [relevant] patents, under the Hatch-Waxman Act Teva still needs a court judgment of noninfringement or invalidity to obtain FDA approval and enter the market.”  So, even though the Eisai patents were unenforceable, Teva was facing enough harm to justify receiving a declaratory judgment.

This case ensures that the second generic manufacturer to the market will not be blocked by the first generic for an unduly long period of time.  It is now clear that the second generic to file an ANDA has the power to control its fate, because it is harmed by any patents blocking its entry to the market, even if those patents are unenforceable.  Being blocked from the market is harm, and it is enough to justify a court ruling.  This decision is clearly in line with prior Federal Circuit decisions, which have held that a controversy exists when an issue is “definite and concrete touching the legal relations of parties having adverse legal interests.”  This opinion also seems to further the interest of FDA regulation of generics.  The first generic to file a Paragraph IV certificate should be rewarded for its efforts with a period of generic exclusivity.  However, a prolonged bar to other generics from the market does not further the interests of the regulatory scheme.

October 31st, 2010 at 5:07 pm

Amount of Spam Email Drops Due to Russian Crackdown


I had not thought about email spam in ages.  It used to be that whenever I logged into my email, including my school email accounts, the vast majority of my emails were unsolicited junk—advertisements for Viagra or other drugs available on the cheap through online “pharmacies,” offers for cash in exchange for sending emails, get-rich-quick schemes, and emails with provocative subject headings that led to XXX websites.

Earlier this week The New York Times published an article chronicling a drop of about 50 billion spam messages per day over the past month.  About 200 billion spam messages still circulate daily—a staggering number, especially considering that my inbox is only clogged these days by the various updates, newsletters, and advertisements to which I have subscribed or agreed, whether wisely or otherwise.  Regardless, they have been solicited.

According to the Times, Russia, a “haven” for cyber criminality, has become a major exporter of spam due in large part to the alleged work of “spam kingpin” Igor Gusev.  Although he denies a connection, Gusev is widely believed to have run SpamIt, which paid spammers to promote online pharmacies.  The SpamIt operation closed inexplicably on September 27th, and the 50-billion-message-per-day curtailment of junk mail followed.  It turns out the Russian government, which has traditionally been lax in its prosecution of cyber crime, had cracked down and charged Gusev not with cybercrime but with operating a pharmacy without a license and failing to register a business.

As someone who had not thought about spam in years, I was amazed at the amount of junk mail that is sent daily and also the amount of spam for which one man seems to have been responsible.  Users like me no longer feel the effects of spam due to the ubiquity of spam filters.

Thus, I wondered why I should care about spam and its continued use.  It seems that increased use of spam filters should be followed by a loss of incentive to use spam as a marketing tool.  Who is still a) receiving this spam and b) buying the products advertised?  Clearly those people exist, since Gusev is reported to have earned 120 million dollars from his company.  As long as spam still reaches even a limited audience, it seems likely to continue.  There is little cost associated with email spam.  Once hackers have lists of email addresses, emails just need to be sent.  Any more legitimate form of advertising is clearly more costly, both economically and in terms of time: paying hosts to post the advertising, making paper copies, making phone calls, walking from bulletin board to bulletin board to hang signs.  As long as people open spammed email messages and make purchases, spamming is clearly a cost-efficient, if illegal, method of marketing.

However, the costs are borne by the rest of us.  Because different countries have different laws and policies regarding the use of spam, its effects are difficult to mitigate.  We have to bear the costs of providing and operating spam filters and of tracking the use of spam.  Moreover, in terms of privacy, we are still reminded that hackers continue to gather our information for these uses.  The New York Times also writes that spam accounts for 90% of all email traffic on the internet.  I cannot help but wonder in what ways the internet would be a different place in a spam-less world.

October 31st, 2010 at 5:01 pm

Emergency Powers in Cyberspace


In the past year, there has been an increase in the number of hack attacks on U.S. companies. In one particularly worrisome case, the attacks were targeted against Google and 33 other companies, including financial institutions and defense contractors. In light of this situation, several senators are drafting a bill that would give the president the power to declare an emergency next time there is a threat in cyberspace. Companies could be forced to take measures, or even shut down, to combat the threat.

According to Reuters, which reported that it had obtained a copy of the draft bill on September 21st, the draft is the result of a merger of two cybersecurity bills.  One of these is S.3480, titled the Protecting Cyberspace as a National Asset Act of 2010, which was introduced by Senator Joseph Lieberman in June.  The bill would establish an Office of Cyberspace Policy within the executive branch and amend the Homeland Security Act of 2002 to add a new National Center for Cybersecurity and Communications (NCCC) within the Department of Homeland Security.  The bill would also allow the president to declare a “national cyber emergency” affecting companies classified as part of the nation’s critical infrastructure network, which would then give the NCCC Director the power to enforce cybersecurity policies over the private sector.

There will surely be strong opposition from technology and telecommunications firms that might be classified as critical infrastructure. These companies will have to front the costs of implementing security measures or be shut down. Undoubtedly, some or all of these costs would be passed down to the consumer. On the other hand, as Senator Lieberman put it in a press release, “our economic security, national security and public safety are now all at risk . . . .”

October 31st, 2010 at 4:48 pm

FDA Silence on Chronic Disease Management Software


In the wake of the HITECH Act, which encourages healthcare providers to use Electronic Health Records (EHR) through subsidies and reduced Medicare payments, patients may know that providers routinely store and transmit electronically their personal health-related information.  Patients with chronic diseases might, however, be surprised to learn that many aspects of their medical care, from the interpretation of lab results, to medication dosage calculations and treatment recommendations, are increasingly handled by software instead of clinicians.

Chronic Disease Management Software (CDMS), like CoagClinic by Standing Stone Inc. and UM-developed AviTracks by Avicenna Inc., either integrates with an EHR or works in its place, guiding clinicians through each step of their workflow.  CDMS typically leads clinicians through patient enrollment, assigns medication dosages based on medical history and drug interactions, aids in scheduling lab tests and clinic visits, automatically alerts clinicians about problematic lab results and missed appointments, and recommends treatment based on standardized treatment protocols.  Thus, with CDMS, clinicians are outsourcing some of their most important functions, like medication dosing, lab interpretation, and treatment decisions, to software programmers.  Considering CDMS is often used in anticoagulation clinics, where patients take warfarin, a risky anti-clotting drug with many adverse interactions (one also used as rat poison), it might seem problematic that important dosage and lab-interpretation tasks are largely left to programmers.
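To make concrete the kind of automated lab-result monitoring described above, here is a toy sketch of an out-of-range alert; the INR target range and patient data are invented for the example and do not reflect the clinical logic of any real product:

```python
# Toy sketch of a CDMS-style lab alert: flag INR (anticoagulation) results
# outside a configured therapeutic range. All values here are hypothetical
# and for illustration only -- not clinical guidance.
INR_TARGET = (2.0, 3.0)  # assumed therapeutic range for a warfarin patient

def lab_alerts(results):
    """Return (patient, message) pairs for results outside the target range."""
    low, high = INR_TARGET
    alerts = []
    for patient, inr in results:
        if inr < low:
            alerts.append((patient, f"INR {inr} below range: possibly subtherapeutic"))
        elif inr > high:
            alerts.append((patient, f"INR {inr} above range: elevated bleeding risk"))
    return alerts

# Patients A and C fall outside the assumed range; B does not.
print(lab_alerts([("A", 1.4), ("B", 2.5), ("C", 4.1)]))
```

The real products presumably encode far richer protocols, but even this sketch shows why the choice of thresholds, made by programmers rather than the treating clinician, carries clinical weight.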

So far the FDA, which has authority to regulate devices “intended for use in the . . . cure, mitigation, treatment, or prevention of disease,” 21 U.S.C. § 321(h), has declined to regulate CDMS or related Health IT.  This is somewhat inconsistent with its past statements on medical software.

In 2002 the FDA released unofficial guidelines for manufacturers whose medical devices incorporate software, in its General Principles of Software Validation.  This guidance document specifies how manufacturers can comply with 21 C.F.R. 820, which requires that software undergo validation, design reviews, and testing.  The FDA justifies these requirements by noting that 7.7% of all medical device recalls were due to software failures, and “the vast majority of software problems are traceable to errors made during the design and development process.”  It concludes that “software engineering needs an even greater level of managerial scrutiny and control than does hardware engineering.”

In the 2005 Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices, the FDA issued additional guidance, this time requiring premarket approval for devices incorporating software.  This guidance document outlines steps that a manufacturer must complete before the FDA will initially approve a device, including analysis of the device’s “level of concern,” hazard, design, and software verification and validation.  Despite arguably having jurisdiction under this plain language, the FDA does not require pre- or post-market approval of CDMS or EHR software, instead leaving governmental oversight in this area to the Office of the National Coordinator for Health IT.

One potential explanation is that the FDA is backing off CDMS, and Health IT generally, due to the enormous potential for cost savings and thus lower provider fees.  CDMS significantly limits bad outcomes due to clinician errors, thereby improving patient outcomes and clinic efficiency and reducing exposure to malpractice liability.  CDMS is somewhat unusual in its potential to lower healthcare prices.  Most new medical technology adds to the cost of healthcare, as patients demand the newest treatment technology and clinicians seek opportunities to charge higher fees.  Additionally, because of this industry’s youth, CDMS manufacturers, like UM’s Avicenna, are likely to be smaller than the typical medical device manufacturer.  Burdensome regulations may stifle development in this dynamic new market.  Perhaps the FDA has recognized that in order to capitalize on CDMS’s huge potential benefits for patient outcomes, which will slash providers’ costs and ultimately prices, it must hold off imposing extra costs on the manufacturing process, at least in the short term.

October 30th, 2010 at 3:09 pm

Funding for embryonic stem cell research can continue, for now


The Court of Appeals for the District of Columbia Circuit decided last week to allow federally funded human embryonic stem cell research to continue, extending a temporary stay of a lower court’s injunction barring the funding.  A district court ruled in August that government funding for embryonic stem cell research was barred by the 1996 Dickey-Wicker Amendment, which prohibits government funding for any research in which a human embryo is destroyed.

The government successfully met the standards required for a stay pending appeal, part of which was demonstrating irreparable harm without the stay.  The government argued that the stay was needed to avoid terminating research projects already in progress, which would have a negative impact on the development of new treatments for serious illnesses.  Embryonic stem cells are derived from embryos and have the ability to differentiate into any cell type and propagate indefinitely.  Because of these properties, embryonic stem cells may one day be able to contribute to the treatment or cure of diseases such as cancer, Parkinson’s Disease, type 1 diabetes, blindness, and spinal cord injuries.

This was positive news for the White House, as President Obama has made stem cell funding by the National Institutes of Health a top priority, broadening federal funding from former President Bush’s order limiting funding to research dealing with already existing cell lines.  Since Obama lifted the former restriction in 2009, the NIH has funded over $500 million in human embryonic stem cell research.

The current case was brought by James Sherley and Theresa Deisher, scientists who work with adult stem cells and are ethically opposed to embryonic stem cell research.  They argue that funding for human embryonic stem cell research unfairly takes away money and support from their own research, a position supported by right-to-life groups who oppose the destruction of human embryos.

It could be several months before the appellate court issues a final ruling on the legality of the funding, and whichever side loses will most likely appeal to the Supreme Court.  Congress, however, could (and some think should) end the debate earlier by rewriting the bill that funds the NIH to explicitly allow human embryonic stem cell research.  Similar bills were passed in 2006 and 2007, but vetoed by Bush.  The most recent incarnation of the bill, the Stem Cell Research Advancement Act, is currently before the House.

However, regardless of whether the Supreme Court or Congress makes the final rule, the current stay of the injunction is evidence that the tide is turning in favor of human embryonic stem cell research, a victory for both researchers and those suffering from diseases that may one day be cured by it.

October 18th, 2010 at 11:24 pm

Six Major Technology Firms Settle with D.O.J. After Antitrust Probe


On Friday, September 24, 2010, the U.S. Department of Justice announced that it had reached a settlement with six prominent technology companies (Adobe, Apple, Google, Intel, Intuit, and Pixar) relating to a probe into corporate recruiting policies that potentially violated antitrust laws.  The six companies allegedly formed agreements preventing them from directly soliciting each other’s employees.  The proposed settlement, if approved by the U.S. District Court for the District of Columbia hearing the issue, will prohibit the companies from utilizing anticompetitive non-solicitation agreements for five years and will require them to implement compliance systems to ensure these recruiting practices are discontinued.  This announcement is the latest in a series of antitrust investigations by the D.O.J. involving the technology sector – the department recently investigated overlap between the Apple and Google boards of directors and considered the antitrust implications of the Google Books settlement.

In its press release, the D.O.J. noted the high technology sector’s strong demand for employees with “advanced or specialized skills,” and suggested that one benefit of the settlement will be better career opportunities for these valuable workers because they would be in an unrestrained job market with a “properly functioning price-setting mechanism” (i.e., more opportunities for employment and the potential for higher salaries due to increased leverage during salary negotiations).

Interestingly absent from the press release was any mention of the settlement’s benefits to consumers of technology and media.  In addition to the advantages gained by the workers previously kept “under wraps” by these companies, technology and media consumers may soon be indirect beneficiaries of an increased pace of technological development and a broadening of engineers’ technical capabilities, stemming from the new knowledge and philosophies gained by working in a new environment.

October 18th, 2010 at 11:20 pm

Improved Patent Searching


A premise of our patent system lies in striking a bargain with the public. A monopoly of limited duration is granted in return for an invention’s disclosure. The hope is that patent publication will “promote the progress of science and the useful arts” by publicizing new and innovative ideas.

Underlying the bargain is an assumption that issued patents and published applications are available to the public, and that the public, upon conducting a search, can find relevant information.  However, patents, and legal documents in general, are unique compared to other knowledge.  In the realm of patents, relevant search results need to identify whether a patent contains relevant or related knowledge, as opposed to an exact match of search terms.  While this is the same type of problem that plagues searches for legal material in general, the challenges facing patent searchers are especially acute.  Patents are increasingly global in scope and require searching across diverse classification schemes and languages.

The difficulty of conducting accurate and precise patent searches has the potential to undermine the goals of the patent system by impeding progress and wasting investment dollars.  For example, in May 2010, Rolls-Royce sued United Technologies Corp.’s Pratt & Whitney unit claiming infringement of a method of manufacturing quieter and more efficient jet engines.  Pratt & Whitney had invested more than $1 billion to independently develop a more-efficient design.  (See Rolls-Royce PLC v. United Technologies Corp., 10cv457, 2009 U.S. Dist. LEXIS 127214 (Alexandria)); see also the interesting counter-suit.

You can be sure that a sophisticated party such as Pratt & Whitney performed a freedom-to-operate search and analysis.  Yet such searches are inherently difficult and time consuming.  I would like to highlight a foreign approach that has been around since 2006, but has recently received an important enhancement to patent searching.  The article is forthcoming in a technical journal, but may pass below the radar of mainstream legal publications.

In 2006, at the first international conference on Semantics and Digital Media Technology in Athens, researchers presented an article titled, PATExpert: Semantic Processing of Patent Documentation outlining the technical details of their algorithm. PATExpert is an EU funded consortium, and includes the European Patent Office as a partner. (For a competing and different approach to patent searching used in the Korean Patent Office, see Segev, A.; Kantola, J. Patent Search Decision Support Service, 2010 Seventh International Conference on Information Technology: New Generations (ITNG), Digital Object Identifier: 10.1109/ITNG.2010.99.)

In distilled form, the PATExpert software relies on an ontology to create relationships between data common to different patents.  In addition to traditional linguistic matching, the software attempts to return results based on how a particular patent is related to others.  In its current state PATExpert uses the following search methodologies: full text search, metadata search, image similarity search, semantic search, and document similarity search.  Few or nonexistent connections indicate that there is no relationship between two patents.  However, a large number of connections between two patents signifies a strong relationship and, despite linguistic differences, may indicate that the two patents have a lot of substance in common.
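As a rough illustration of matching by relationships rather than by words alone, one could count the facts two patents share.  The data model below is invented for the example and is far simpler than PATExpert's actual ontologies:

```python
# Toy illustration of relationship-based matching: score a pair of patents
# by how many metadata "connections" (shared facts) they have, independent
# of the words in their claims. The fields and values are hypothetical.
def connection_count(patent_a, patent_b):
    """Count the metadata facts the two patent records share."""
    shared = 0
    for field in patent_a.keys() & patent_b.keys():
        shared += len(set(patent_a[field]) & set(patent_b[field]))
    return shared

p1 = {"assignee": ["Acme Corp"], "class": ["F02K 1/00"], "cites": ["US1234567"]}
p2 = {"assignee": ["Acme Corp"], "class": ["F02K 1/00"], "cites": ["US7654321"]}
p3 = {"assignee": ["Other LLC"], "class": ["A61K 9/00"], "cites": ["US9999999"]}

print(connection_count(p1, p2))  # shared assignee and class -> related
print(connection_count(p1, p3))  # no shared facts -> unrelated
```

A real ontology also encodes image features and linguistic structure, as the next paragraph describes, but the intuition is the same: many connections suggest substantive overlap even when the vocabulary differs.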

For example, images such as photographs, diagrams, flow charts, and drawings are important factors in the representation of a patent’s disclosure and are modeled by an ontology. Additional data such as the patent holder, inventor affiliation, and assignee are also captured in an ontology. An additional ontology captures linguistic similarities. By looking at each ontology independently or as a combination, a representation of related patents may be obtained. The hope is that more relevant results, independent of linguistics, can be returned in performing a patent search.

The forthcoming article Iterative Integration of Visual Insights during Scalable Patent Search and Analysis by Koch, S; Bosch, H; Giereth, M; Ertl, T (IEEE Transactions on Visualization and Computer Graphics, Volume: PP , Issue: 99 Digital Object Identifier: 10.1109/TVCG.2010.85) outlines development of a visual interface for the PATExpert algorithm. The interface called “PatViz” makes PATExpert user friendly and potentially more powerful. Using PatViz, users can now graphically construct patent queries, similar to constructing a flow chart using Microsoft Visio. Furthermore, and perhaps more important than query construction, each step or node in a query can be enlarged to show the patents that comprise the results returned at the particular node.

Take a simple example, as described in the article.  In searching for a patent that contains the phrase “disk recorder” but not the terms “CD” or “DVD,” a user can enlarge each of the nodes containing intermediate results without the term “CD” or “DVD,” or enlarge the node containing the term “disk recorder” without any other restrictions.
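The boolean logic of that example query can be sketched in a few lines; the document set below is invented for illustration:

```python
import re

# Minimal sketch of the article's example query: match documents containing
# the phrase "disk recorder" but not the standalone terms "CD" or "DVD".
def matches(text):
    """Case-insensitive phrase match plus term exclusion."""
    lowered = text.lower()
    tokens = set(re.findall(r"[a-z0-9]+", lowered))
    return "disk recorder" in lowered and not ({"cd", "dvd"} & tokens)

docs = [
    "A disk recorder with a magnetic head",   # phrase present, no exclusions
    "A disk recorder writing to CD media",    # excluded by the term "CD"
    "A DVD player with remote control",       # phrase absent
]
print([matches(d) for d in docs])  # [True, False, False]
```

In PatViz each clause of such a query becomes a node in the graphical flow, and the intermediate result sets at each node can be inspected directly.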

Data at a particular node, or at a final node, may be represented in 11 different visualizations.  For example, one can view a graph of how results relate to different subject matter classification schemes, or which results are U.S. patents.  Another helpful representation is a “Term Cloud.”  A page of text is displayed containing all search terms used.  PatViz highlights the most frequently found terms by enlarging the font of the term, the size of the font corresponding to the frequency of the term in the search results.  This gives much more of a “feel” for the content and scope of a patent search.
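The frequency-to-font-size mapping can be sketched with a simple linear scaling; the formula and point sizes here are assumptions for illustration, not PatViz's actual ones:

```python
from collections import Counter

# Toy term-cloud sizing: map each term's frequency in the search results
# to a font size by linear interpolation. Bounds are arbitrary choices.
def term_cloud_sizes(terms, min_pt=10, max_pt=36):
    """Return {term: font size in points} scaled by term frequency."""
    counts = Counter(terms)
    lo, hi = min(counts.values()), max(counts.values())
    if hi == lo:                      # all terms equally frequent
        return {t: max_pt for t in counts}
    return {
        t: round(min_pt + (n - lo) * (max_pt - min_pt) / (hi - lo))
        for t, n in counts.items()
    }

# Hypothetical term occurrences drawn from a set of search results.
hits = ["engine"] * 9 + ["turbine"] * 4 + ["nozzle"] * 1
print(term_cloud_sizes(hits))  # "engine" largest, "nozzle" smallest
```

The most frequent term gets the largest font and the rarest the smallest, which is what gives the reader a quick "feel" for the search results' content.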

The article is full of technical details, but what is important for patent lawyers is the possibility of easier and less time-consuming freedom-to-operate analyses.  How these more efficient patent searches will affect patent prosecution and litigation remains to be seen.

October 11th, 2010 at 12:30 pm