algorithm Archives - IPOsgoode

15,000 Eyes in New York City
Wed, 14 Jul 2021
Photo Credits: (Unsplash)

Tiffany Wang is an IPilogue Writer, IP Innovation Clinic Fellow, and a 2L JD Candidate at Osgoode Hall Law School.


“Always eyes watching you and the voice enveloping you. Asleep or awake, indoors or out of doors, in the bath or bed—no escape. Nothing was your own except the few cubic centimeters in your skull.”

Nineteen Eighty-Four by George Orwell

Sprawling throughout New York City, cameras observe and record faces and movements. These seemingly omnipresent lenses generate facial-recognition data for the New York Police Department (“NYPD”), which uses the captured footage to identify individuals for criminal enforcement efforts.

Surveillance by security cameras is not unique to the Big Apple. Chinese police forces deploy similar networks, which notify security personnel whenever a target appears in sight. In total, more than 15,000 cameras operate to surveil the population of New York.

Not only do public cameras tower over the city; privately owned cameras also record information that the NYPD may access with permission.

The Big Apple’s cameras capture activity in an area of less than two square miles where over ninety percent of residents are racial minorities.

Idemia, whose facial-recognition technology serves United States federal and state police forces, came under scrutiny in 2019 when testing revealed that its algorithms are prone to confusing racial minorities’ faces.

The collection of facial data raises privacy concerns. If commercial algorithms continue to demonstrate significant errors in identifying individuals with varying skin tones, these concerns will quickly escalate. Tensions may ultimately result in challenges to the NYPD’s use of the technology.

Could over 15,000 eyes create a dystopia?

Algorithmic Accountability: Prof. Frank Pasquale’s Thoughts on Artificial Intelligence in the Law
Thu, 06 Apr 2017
Algorithms are everywhere. Applied to systems like personal assistants, financial exchanges, and self-driving cars, computers now permeate almost every aspect of modern life. But how far should this algorithmic revolution extend into the law? Should contracts, judgement, and litigation strategies follow suit?

These questions are at the forefront of Professor Frank Pasquale’s research and were the topic of discussion at his recent talk as part of the IP Osgoode Speaks Series. Prof. Pasquale brought a simple message to his talk: the law ought to be “A Rule of Persons, Not Machines.”

Correcting Bias with Bias

To begin, take the pinnacle actor of our legal system: the judge. Judges are human, and they bring human biases with them to the courtroom. Studies demonstrate that judicial outcomes can depend on variables such as the performance of the local football team or whether the decision was made before or after the judge’s lunch break. In contrast, computer algorithms can produce more consistent legal outcomes across a given set of cases. Humans are biased, after all, so why not replace them?

Algorithms have biases too, answers Prof. Pasquale. Their outcomes depend on the humans that develop their code. Which factors should be given heavier weight in a computer’s decision? How much room should be carved for the protection of constitutional rights? What if the software contains undiscovered errors? How should the system import contemporary societal values into its decision? In considering these questions, Prof. Pasquale shows that greater consistency does not equate to fewer biases.
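The point that design choices encode bias can be made concrete with a deliberately oversimplified sketch. Every factor and weight below is hypothetical, invented purely for illustration; no real risk-assessment system is being described:

```python
# Hypothetical risk-scoring sketch: every weight below is a human design
# choice, so a "consistent" output does not mean an "unbiased" output.

def risk_score(prior_convictions: int, age: int, employed: bool) -> float:
    # Who decided prior convictions should count three times as much as
    # youth? Who decided employment should matter at all? A developer did,
    # and those choices import the developer's assumptions into every result.
    score = 3.0 * prior_convictions
    score += 1.0 if age < 25 else 0.0
    score -= 2.0 if employed else 0.0
    return score

# The same inputs always give the same output: perfect consistency...
assert risk_score(2, 30, True) == risk_score(2, 30, True)
# ...yet a single contested weight shifts outcomes for a whole class of people.
```

Consistency and fairness come apart precisely here: the algorithm never wavers, but what it never wavers from is a set of contestable human judgments.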

In addition to these coding difficulties, computer algorithms are much better at evaluating backward-looking inputs than solving forward-looking problems. They may therefore be unable to effectively replace the law-making role of the judiciary. Take, for example, a situation where a computer must decide whether constitutional rights should be extended to novel situations. Analysing historical data to determine the likelihood of a human judge allowing such an extension is not difficult, but fully predicting the extension’s effects on society is near-impossible: there are simply too many variables to consider, many of which, such as personal values, cannot be reduced to simple metrics.

Finally, Prof. Pasquale contends that algorithms lack transparency. Human-made judgments reduce legal logic to an intelligible, written form. This writing may be applied, built upon, or criticized by subsequent thinkers in the legal system. Algorithmic judgments do no such thing. Perhaps the coding and mathematics used to process inputs are intelligible, but they provide no guidance as to how the law is, was, or ought to be applied.

Blockchain and Property Law: A Legal “Wild West”

Prof. Pasquale turns to concerns with emerging blockchain technologies to demonstrate the real-world legal problems that arise when cold, efficient machines replace human judgement.

Institutions operated by human beings have traditionally been responsible for tracking and facilitating the exchange of goods and services for currency, at least for big-ticket items. If X purchased a car from Y, the bank would ensure that a precise number of dollars transferred from X’s account to Y’s account, and the vehicle would be registered with the state in X’s name. Money is traded for ownership, and each step is traceable.

This is not the case with blockchain.

Instead of relying on trusted, human-run institutions to maintain the records, blockchain technology relies on its users – the public at large – to track one another’s balances on their computers. A transfer’s legitimacy is verifiable because each user holds a secret digital key that they use to authorize the movement of their funds or property. This key is nothing more than a sequence of numbers – it has no name attached to it, and the only requirement to use it is knowledge of its sequence.
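A deliberately simplified sketch can show why knowledge of the key alone amounts to ownership. Real blockchains use public-key signatures and distributed consensus; this toy ledger, where an “address” is just a hash of the secret, is an assumption-laden illustration and not an implementation of any real protocol:

```python
import hashlib

def address_of(secret: bytes) -> str:
    """An address is just a hash of the secret key; no name is attached."""
    return hashlib.sha256(secret).hexdigest()

class ToyLedger:
    """Toy model: whoever presents the right secret controls the funds.
    (Real blockchains use public-key signatures; this hash-preimage
    scheme is a deliberate simplification for illustration only.)"""

    def __init__(self):
        self.balances = {}  # address -> units of value

    def credit(self, address: str, amount: int) -> None:
        self.balances[address] = self.balances.get(address, 0) + amount

    def transfer(self, secret: bytes, to_address: str, amount: int) -> bool:
        """Any holder of the secret can move the funds; the ledger
        cannot tell an owner from a thief."""
        sender = address_of(secret)
        if self.balances.get(sender, 0) < amount:
            return False  # unknown key or insufficient balance
        self.balances[sender] -= amount
        self.credit(to_address, amount)
        return True

ledger = ToyLedger()
alice_secret = b"42-7-1998"  # knowledge of this sequence *is* ownership
ledger.credit(address_of(alice_secret), 100)

# Anyone who learns the secret is, to the ledger, indistinguishable from Alice.
ok = ledger.transfer(alice_secret, address_of(b"thief-secret"), 100)
```

The sketch makes the legal problem vivid: the ledger validates knowledge of a number, not identity, so losing or leaking the number severs the link between you and your property.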

Although blockchain is revolutionizing transactions by cutting out the “middle man” and making them instantaneous and secure, it creates some serious legal headaches. First, losing your authentication key means losing access to the value stored on the blockchain under that key. If your car title is recorded on the blockchain in association with your digital key, losing your key means you’ve lost the only proof you have that you own the car. Compare this to losing the paper title to your car: you’d simply stop by your local registry with identification and pick up a new one.

Second, imagine your digital key gets stolen by a hacker who hops in your car and drives off into the sunset. Not only do you lack proof that he stole it, but, for all intents and purposes, the blockchain now considers him the true owner. The law cannot help you.

Finally, consider the benefits and protections the law confers on you by virtue of owning your car. If the car is defective, you can return it to the seller. If your friend doesn’t return the car after you lend it to them, the law can force them to give it back. Consumer protections like these and many others are simply unavailable to you if the law does not recognize your ownership.

Prof. Pasquale makes it clear that although blockchain may become useful for certain applications, it illustrates the danger of replacing human-run systems with purely algorithmic ones. Blockchain is not going to make property law go away.

Algorithmic Accountability

It is a grim future, argues Prof. Pasquale, where machines replace the role of human judgement in its entirety. What role, then, does he see computers filling in the legal world? Artificial intelligence should enrich professional judgement, he suggests, not replace it. What constitutes a “better way” to do law must be decided with a broad, social understanding of the systems that the law serves; not through applying technology for the sole sake of efficiency. There is massive room for computers to improve the way that we do law, but it must be approached through a critical lens.

Principles of algorithmic accountability must be interwoven into these systems. Models must be transparent, data must be unbiased and algorithms must be applied to appropriate tasks. Leading thinkers and the public alike must be able to critique these systems and lend their voices to the software’s development. Technology is strengthened when subjected to an open and free exchange of ideas and criticisms, so legal innovation must evolve with public accountability.

Prof. Pasquale left the room with a parting thought: the legal profession needs to carefully shape the algorithmic systems it deploys, or else those systems will shape the legal profession. Using the words of Douglas Rushkoff, Prof. Pasquale warns,

“Program or be programmed.”

 

Mike Noel is an IPilogue Editor and a JD Candidate at Osgoode Hall Law School.

Volkswagen v Garcia et al.: Volkswagen Halts Disclosure of Secret Security Algorithm
Tue, 24 Sep 2013
Last June, Justice Birss of the High Court of England and Wales (Chancery Division) ruled in favor of Volkswagen and granted an interim injunction against Flavio Garcia, a Computer Science Lecturer at the University of Birmingham, prohibiting him from publishing an academic paper that sought to expose weaknesses in Volkswagen automobile security systems.

The paper disclosed the algorithm used to activate the security system, the Megamos Crypto chip, which Volkswagen uses in its vehicles. A group of academics – the defendants in this lawsuit – were able to crack the security system and discover its flaws. The problem arose when they proposed to present a paper at a conference that would reveal the algorithm to the public. Given the confidential nature of the information at stake, the academics notified Volkswagen, the proprietor of the information, before publication – though only shortly before the date of the conference. Volkswagen contacted Garcia and his associates, requesting that they redact the vehicles’ security codes. The scientists refused, arguing that the public has a right to see the weaknesses exposed. Volkswagen subsequently sought an injunction against the researchers on the grounds that revealing the codes used to activate the ignition systems would facilitate criminal activity.

Flavio Garcia and his associates had purchased and used software called Tango Programmer, produced by a Bulgarian company called Scorpio. A central question in the case was whether this software was legitimate. Justice Birss concluded that it was, and that the fact that it originated from Bulgaria had no significance in this respect. He further dismissed the claimant’s suggestion that the software’s presentation in “broken English” proved its illegitimacy.

The defendants contended that Volkswagen had no right to sue. According to the facts, the principal developer of the Megamos Crypto algorithm is the company Thales. Although Thales was not originally a party to the lawsuit, Justice Birss found that it was a “proper and necessary” party to the dispute and added it to the action. He went on to state that, on the authorities, the confidentiality in the Megamos Crypto algorithm most likely belongs to Thales, as the algorithm’s creator. Nevertheless, the court also found that Volkswagen had a legitimate interest in being a co-claimant.

The defendants also contended that Volkswagen had no claim for misuse of confidential information. On the question of reverse engineering, the court referred to an earlier decision holding that it was not a misuse of confidential information to reverse engineer a purchased product in order to acquire information encrypted for security. Justice Birss held that, in this case, there would nonetheless be an arguable breach of confidence, because the claimants successfully called into question the legitimacy of the route by which the algorithm found its way into Tango Programmer.

The defendants further contended that there was a strong public interest in the publication of the paper and that they had acted in accordance with responsible disclosure principles. Justice Birss considered the relevant authorities, including the Cream Holdings judgment. According to the court, the standard for restraining publication is a flexible one, and a court should be “exceedingly slow” to make interim orders unless satisfied that the claimant is likely to succeed at trial. Here, the court considered it reasonably likely that either Thales or Volkswagen would succeed at trial, which satisfied the first requirement.

As for the public interest argument, Justice Birss acknowledged that freedom of expression and academic freedom are of major importance. In balancing freedom of expression against public safety, however, the court decided in favor of the latter. He stated, “I recognise the high value of academic free speech, but there is another high value, the security of millions of Volkswagen cars.”

The judge granted the injunction sought by Volkswagen and ordered the “redaction” of the paper the defendants had written.

The present case illustrates the tension between a rapidly evolving society and the conservatism with which the law is applied. The court’s ruling, in my view, poses a significant future danger: it places an obstacle before academics in the UK and abroad who research and publish on flaws in security systems. Judges around the world will eventually face cases like this one and may have to re-strike the balance between freedom of expression and confidentiality, potentially leading to a better-informed public or to greater harm from the disclosure of secret security information of this nature.

Georgios Andriotis is an IPilogue Editor and a law student at Université de Montréal.
