This article provides an overview of the Thaler decision and an assessment of its implications if upheld on appeal. It then compares the decision with others on the issue around the world, closing with a final word on the issues in play and matters for the AI industry to consider.
Thaler – An overview of the decision
Dr Stephen Thaler invented the Device for Autonomous Bootstrapping of Unified Sentience (DABUS). DABUS is an AI system built to generate new ideas. Of its own accord, DABUS envisioned an improved beverage container and a flashing light for use in emergency situations – inventions that Dr Thaler claimed he would not have been able to come up with himself. Dr Thaler then filed patent applications in a number of jurisdictions including Australia, Canada, China, Europe, Germany, India, Israel, Japan, South Africa, the UK, and the US.
In Australia, the Deputy Commissioner denied Dr Thaler’s patent application naming DABUS as inventor because it failed to name a human inventor. Treating DABUS as an inventor was found to be inconsistent with the Patents Act 1990 (Cth) and the Patents Regulations 1991 (Cth). Dr Thaler then sought judicial review of that determination.
The Thaler decision was essentially an exercise in standard statutory interpretation. The Patents Act 1990 (Cth) (the Patents Act) provides that a patent for an invention may only be granted to a “person” who is the “inventor”, who would be entitled to have the patent assigned to them, or who derives title to the invention from the inventor or such an assignee. The statute does not define the term “inventor”. In the absence of such a definition, Justice Beach found that there was nothing inherent in the term “inventor” that required it to be limited to humans. Indeed, the court found that no mental act or state of an inventor is required for an “invention” or “inventive step”. “Inventor” is an agent noun, and an agent can be a person or a thing that invents. Furthermore, Justice Beach found that allowing AI systems like DABUS to be named as inventors on applications furthered the Patents Act’s object, as set out in s 2A, to “[promote] economic wellbeing through technological innovation and the transfer and dissemination of technology” and “[balance] over time the interests of producers, owners and users of technology and the public”. Accordingly, His Honour held that “computer inventorship would incentivise the development by computer scientists of creative machines, and also the development by others of the facilitation and use of the output of such machines, leading to new scientific advantages”.1
This also reflects the reality that for many otherwise patentable inventions, technological assistance is necessary for a human inventor to complete their innovation. It was further noted that the Deputy Commissioner’s decision would mean that an otherwise patentable invention with no human inventor could not receive the protection of a patent.
Justice Beach addressed two key implications of his ruling: inventiveness and entitlement.
To be granted a patent, an invention must be novel, useful, and involve an inventive step. While noting that whether an inventive step exists is a separate question from how the invention arose, Justice Beach considered that the threshold for inventiveness may rise as a consequence of allowing AI systems to be considered inventors. Under s 7(2) of the Patents Act, an invention will not involve an inventive step if it would have been obvious to “a person skilled in the relevant art in the light of the common general knowledge”. In the future, a “person skilled in the relevant art” may be taken to be assisted by or have access to AI, or to have knowledge of developments produced by AI in the relevant field as part of the common general knowledge. However, since this point was not directly in issue, it is strictly obiter and merely indicates how the law could evolve if the decision is upheld.
Justice Beach recognised that his ruling raises an issue over entitlement to, or ownership of, the product invented by AI. Human inventors can transfer their inventorship rights to others by contract. An AI system is not a legal entity and therefore cannot enter into agreements that would allow it to transfer its inventorship rights. While acknowledging this, Justice Beach nevertheless stated that under s 15(1)(c) of the Patents Act, Dr Thaler could “derive” title to the invention from DABUS. Section 15(1)(c) states that a patent for an invention may only be granted to a person who derives title to the invention from the inventor or a person entitled to be an assignee of the patent. Inventions can be possessed, and ownership can arise from possession. Dr Thaler derives ownership of the inventions from his ownership and control of DABUS.2
This means that human ownership is possible where an AI system is designated as the inventor. The owner, or possibly controller, of the AI system will have and be able to exercise proprietary rights in the invention.
Overseas approaches
Apart from South Africa granting a patent listing DABUS as inventor, overseas approaches have tended to differ from the Australian decision, with two distinct thematic departures: the reading of the relevant legislation and the treatment of Dr Thaler’s ownership entitlement.
In 2020, the European Patent Office (the EPO), the UK Intellectual Property Office (the UKIPO), and the US Patent and Trademark Office (the USPTO) separately refused Dr Thaler’s patent applications. Unlike the Federal Court of Australia, the England and Wales High Court upheld the decision of the UKIPO.3 The rejection of Dr Thaler’s patent application has also recently been affirmed in the UK Court of Appeal, with Lord Justice Birss dissenting.4 In the US, on appeal from the USPTO, the United States District Court for the Eastern District of Virginia also declined to allow DABUS to be an inventor, in a decision handed down after the Australian one.5 The EPO further noted in its decision that China, Japan, and Korea require inventors to be human.
The EPO, UKIPO, and USPTO refused Dr Thaler’s applications because their respective legislation did not allow non-persons to be listed as the inventor of a patent. The judicial decisions in the UK and US upheld this reasoning. By contrast, Justice Beach found that the Patents Act in Australia was not so prescriptive: it was possible to take a more purposive reading of the statute and allow an AI system to fit the definition of “inventor”.
Although the EPO, UKIPO, and USPTO declined Dr Thaler’s application, this does not necessarily mean they have closed the door on AI inventorship. The UKIPO has since consulted on a range of options, including legislation, to address the fact that AI tools cannot be listed as inventors. The UKIPO considered it advisable to debate the issue more widely rather than trying to fit novel cases into existing statutes.
A major reason the EPO and UKIPO rejected the applications was that ownership of a patent in those jurisdictions must be assigned in specific ways that an AI system is presently unable to fulfil.
Because AI systems are not legal persons, the EPO considered that they have no rights and cannot be granted such rights as an inventor would hold under the current statutory scheme. Under European patent law, the owner of the invention is more important than the inventor, and an applicant who is not the inventor must state how they derive the right to the invention. Two avenues are possible – the right of ownership can arise out of an employment relationship or by succession in title. The EPO rejected both on the basis that AI systems have no legal personality. AI systems cannot be employed, and they cannot transfer rights to a successor in title because they have no legal title to their output.
The UKIPO took a similar route. The Patents Act 1977 (UK) only allows entitlement by virtue of a rule of law, by way of an agreement with the inventor before the making of the invention, or for successors in title of the inventor by way of the doctrine of accession. The UKIPO also considered that Dr Thaler could not claim ownership under any of these avenues because DABUS, as a machine, is not a legal person.
Arnold LJ and Laing LJ in the UK Court of Appeal rejected accession because of its roots in the concept of exclusive possession. Accession only applies where tangible property produces new tangible property, such as grapes being turned into wine, but cannot occur where tangible property, such as DABUS, produces intangible property, such as a patent. Exclusive possession can be applied to tangible property but not intangible property. While only one party can possess a tangible object, numerous parties can use an invention at the same time.6
Justice Beach’s decision may yet be appealed, but it is an open question as to whether it would be desirable for the legislature to deal with the issue of AI inventorship by updating the Patents Act. New legislation may allow the opportunity for greater certainty in criteria, particularly as what constitutes an AI system worthy of inventorship is not clear at present.
Ownership vs inventorship
As discussed above in regard to entitlement, Justice Beach reached an opposing conclusion to that of the EPO and the UKIPO. In Justice Beach’s view, Dr Thaler owns DABUS’ output in the same way that the owner of an animal would own that animal’s progeny, or occupiers of land own the fruit or crops gained by their labour.7 Similarly, the EPO accepted that the owner of a machine can own the machine’s output, but held that this was insufficient to ground ownership rights under European patent law. Under Australian law, Justice Beach ruled that it is.
His Honour arrived at that conclusion because the Patents Act allows an owner to “derive” rights from an inventor. Derivation is broader than the movement of title – possession of an AI system, which does not itself have title to legal rights, was found sufficient to ground ownership. But if it is clear from the start that a particular inventor should have no ownership rights, then is that inventor really deserving of the status of inventorship? We would hesitate to claim that our animals and crops and basic computers are inventors, and it is in large part this hesitation that makes it acceptable for us to own their outputs.
Whether AI should be an inventor depends on whether the particular technology is sufficiently autonomous to be free of human interference, but this is both a legal and a technical question. While the two are likely to track one another, the legal question turns on what is legally important about the category of inventor. If the purpose of having an inventor is to recognise the rights that one has in one’s own creation, then an AI system may not be there yet. This is the route that the EPO and UKIPO took. If the purpose is, instead, to protect the invention to encourage dissemination and innovation, then ownership takes priority over inventorship and an AI system can be listed as an inventor within the existing framework in order to give effect to a patent and human ownership. This is the Australian conclusion. It might be said, therefore, that Justice Beach’s decision prioritises ownership as much as the EPO and the UKIPO, but leads instead to the opposite conclusion due to a different rationale. What mattered for Justice Beach was that ownership should be granted where there is a patentable invention, regardless of the inventor’s identity. The focus was on the social and economic benefits of granting a patent.
The Australian position leads to an elevation of the AI system to the status of inventor while simultaneously downplaying that role, which leaves some key issues on inventorship unresolved and makes it a question worthy of further exploration.
Autonomous or automated?
This section concerns the technical side of whether a particular technology is sufficiently autonomous. Justice Beach was able to find that DABUS is worthy of inventorship, though not ownership, because he accepted DABUS as semi-autonomous. His Honour described it as “a form of neurocomputing” built by combining two kinds of artificial neural networks (ANNs), capable of generating new concepts and predicting the novelty, utility, or value of those concepts. He accepted that DABUS can be described as self-organising, capable of generating new patterns of information and adapting to new scenarios without additional human input, and able to mimic aspects of human brain function. On this view, DABUS is not just a human-generated software program that fits solutions to a given problem.8
Against this, machine learning techniques such as ANNs and evolutionary algorithms have been applied for decades in science and engineering, yet technical accounts do not describe these techniques as “autonomous entities”.9 Though the techniques can perform tasks without explicit programming from humans, there is not a complete absence of instructions determining how the input-output relation is arrived at. While legal narratives tend to claim that such techniques generate inventions “autonomously”, technical literature across the life sciences, molecular modelling, and various kinds of engineering usually describes them as “automated”.10 Automation means that a device can perform a task without human participation during performance, whereas autonomy tends to refer to the ability to choose one’s own rules and goals and work towards them. From a technical perspective, there may be some way to go before we can treat machines as true inventors, and it may be less problematic than claimed to name Dr Thaler himself as the inventor of DABUS’ output.
Equally, however, the importance of the technical evaluation is dependent on the legal rationale for the category of inventorship. What this suggests is that there is currently no uniform way to understand what inventorship means, what it is that machine learning really entails, and how the two should fit together to inform IP law.
Other types of AI-generated IP
Justice Beach’s decision may lead the way for other AI-generated creations to be given recognition under IP law, though any decisions by either government bodies or the courts will likely turn on the construction of the relevant statutes.
Currently in Australia, copyright requires a human creator. Each work must have an author who is a “qualified person”. Under s 32 of the Copyright Act 1968 (Cth) (the Copyright Act), a “qualified person” is defined as an Australian citizen or a person resident in Australia, or a body corporate incorporated in Australia. AI systems are not human persons, citizens, or bodies corporate. In addition, for a work to attract copyright, it must be original, and originality requires a degree of independent intellectual effort during the creation of the work.11 It therefore appears less likely that AI-generated works could be granted copyright in Australia under the Copyright Act.
Section 13 of the Designs Act 2003 (Cth) (the Designs Act) only allows the registered owner of a design to be a “person” who created the design, or a person who derives title from the creator. The language in the Designs Act is similar to that of the Patents Act in both its use of the word “person” and the structure of the section. Justice Beach found that requiring an inventor to be a “person” did not exclude AI systems from inventorship, so that avenue may be open for designs. Interestingly, in the UK, s 2 of the Registered Designs Act 1949 (UK) removes this possibility by explicitly providing that, for designs generated by computer such that there is no human author, the person who made the arrangements necessary for the creation of the design is taken to be the author.
Note, however, that the Copyright Act and the Designs Act do not contain the object clause found in s 2A of the Patents Act that assisted Justice Beach in his purposive reading of the statute. As such, their policy considerations may differ.
Key takeaways
- Justice Beach in the Federal Court of Australia has ruled that AI systems can be named as inventors on a patent application, though they cannot be owners. It was found that the Patents Act 1990 (Cth) did not require inventors to be human, and Dr Thaler was able to derive proprietary rights from an AI system by possessing that AI system. The decision may yet be appealed.
- The EPO, UKIPO, and USPTO rejected Dr Thaler’s patent applications because their respective legislation did not permit non-humans to be inventors. This was in large part because they could not find a legal grounding on which Dr Thaler could derive proprietary rights from an AI system.
- Technical perspectives on machine learning applications tend to differ from legal perspectives on the topic, with legal discourse more likely to see such applications as “autonomous”, whereas technical discourse refers to those same applications as “automated”. This may hold implications for the threshold at which AI systems can be considered inventors.
- The threshold of inventiveness for patent applications may rise in future to account for AI assistance or knowledge of developments produced by AI systems.
- While AI systems are unlikely to be granted copyright under Australian law, it is possible they may be recognised as the creator of a registered design. However, the question remains open for both and has yet to come under consideration.
1 Thaler v Commissioner of Patents [2021] FCA 879 (hereafter, Thaler).
2 Thaler.
3 Thaler v The Comptroller-General of Patents, Designs and Trade Marks [2020] EWHC 2412 (Pat).
4 Thaler v Comptroller General of Patents Trade Marks and Designs [2021] EWCA Civ 1374.
5 Thaler v. Hirshfeld, 20-903, U.S. District Court for the Eastern District of Virginia (Alexandria).
6 Thaler v Comptroller General of Patents Trade Marks and Designs [2021] EWCA Civ 1374.
7 Thaler.
8 Thaler.
9 Daria Kim, “AI-Generated Inventions: Time to Get the Record Straight?”, 69 GRUR International 5, May 2020.
11 IceTV Pty Ltd v Nine Network Australia Pty Ltd [2009] HCA 14.