Privacy in a pandemic – the puzzle of COVID-19 check-in solutions

In the early days of the COVID-19 pandemic, when we were all grappling with an avalanche of new concepts such as “social distancing” and “personal protective equipment,” the idea of contact tracing left many scratching their heads. In the age of automation, a manual process involving teams of people making phone calls and conducting interviews to find those exposed to the virus can seem almost counter-intuitive. Perhaps this is part of the reason why 2020 saw a dedicated effort to find digital solutions to the challenges of COVID-19. This included digital contact tracing solutions such as the Australian Government’s COVIDSafe app. In practice, however, visitor registers – records of who visited particular venues, and their contact details – proved far more useful to contact tracers.

After the initial lockdowns, as businesses such as retailers, bars and restaurants worked to reopen, a number of digital check-in solutions emerged to help businesses meet their contact tracing record-keeping obligations. These services provide QR codes, allow customers to check in, keep the records and, if required, disclose personal information to contact tracers. Many also seek consent for direct marketing, either as an opt-in or as bundled consent (i.e. the customer must agree to receive marketing or cannot check in). In some states, such as New South Wales (NSW), electronic check-in is now mandatory.

But what about privacy? A recent survey by the Consumer Policy Research Centre found that 94 percent of Australians are concerned about how their personal data is shared online. Can users trust the cavalcade of new app providers? Are these service providers regulated? What happens to the data? Does keeping each other safe mean giving up our privacy rights – the right not to be tracked unnecessarily, and not to receive unwanted marketing? And how should organizations that use check-in apps manage the risks posed by this new breed of service provider?

WHICH LAWS APPLY?

Private sector organizations are covered by Australia’s federal privacy legislation, the Privacy Act 1988 (Cth) (Privacy Act). However, section 6D provides that the Act does not apply to “small businesses,” which it defines (with some exceptions) as businesses with an annual turnover of $3 million or less.

So, do COVID-19 check-in app providers fall within this definition of a small business operator? The answer is: it depends. Some do, and are therefore not regulated by the Privacy Act at all. As such, they could use check-in data for other purposes – such as marketing or analytics, or to profile individuals or sell the data to third parties – all without consequence.

Let’s take a closer look.

Paid apps

Generally speaking, paid check-in app providers generate QR codes that take users to the right check-in page. The provider then collects and stores users’ personal information and discloses it to contact tracers when required. Users generally give their express consent to this collection, and the business using the service pays the provider a monthly fee.
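
To make that data flow concrete, the sketch below is a minimal, hypothetical model of how a check-in record might be handled. It is not any provider’s actual design: the field names are invented, and the 28-day retention period is borrowed from the state government apps discussed later in this article.

```python
# Hypothetical sketch of a venue check-in record and the two operations a
# provider performs with it: disclosure to contact tracers and routine deletion.
# Field names, and the 28-day retention period (borrowed from the state
# government apps discussed later), are assumptions, not any provider's design.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List


@dataclass
class CheckIn:
    venue_id: str                    # venue identified by the QR code scanned
    name: str                        # personal information given at check-in
    phone: str
    checked_in_at: datetime
    marketing_consent: bool = False  # should be a genuine opt-in, never bundled


RETENTION = timedelta(days=28)       # delete once no longer needed for tracing


def records_for_tracers(records: List[CheckIn], venue_id: str,
                        start: datetime, end: datetime) -> List[CheckIn]:
    """Disclose only the records a contact tracer actually needs."""
    return [r for r in records
            if r.venue_id == venue_id and start <= r.checked_in_at <= end]


def purge_expired(records: List[CheckIn], now: datetime) -> List[CheckIn]:
    """Drop records older than the retention period."""
    return [r for r in records if now - r.checked_in_at <= RETENTION]
```

Whether a provider actually deletes records on schedule, and what it does with a marketing-consent flag like this, is precisely where the regulatory gap discussed below becomes important.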

Some service providers are large, established businesses with an annual turnover of more than A$3 million, which takes them outside the small business category and makes them subject to the Privacy Act.

Other service providers are startups that may not be turning over much money, but by operating a COVID-19 check-in app they are likely to “disclose personal information … for a benefit, service or advantage”. This is commonly referred to as “trading in personal information” and would ordinarily disqualify them from being small business operators under section 6D(4) – unless they have the consent of the individuals concerned (section 6D(8)), which, by and large, they do. Check-in forms usually include a request for consent, and no one is forced to dine in a restaurant or sit in a cafe – it is each customer’s choice. On that basis, these smaller service providers continue to qualify as small businesses, exempt from the provisions of the Privacy Act.

Free apps

Some check-in apps can be used free of charge, so the provider does not receive a “benefit, service or advantage” for disclosing or otherwise handling check-in records. Many of these providers are also likely to be low-turnover startups, so they too may qualify as small businesses for the purposes of the Privacy Act.

The small business exemption represents a significant gap in the coverage of the Privacy Act: it allows organizations to collect large amounts of personal information while remaining outside privacy regulation.

Australia is unusual in this regard. Other privacy regimes, such as the European Union’s General Data Protection Regulation, have no equivalent exemption.

HOW CAN PRIVACY RIGHTS BE PROTECTED?

State and territory governments, including NSW, Victoria, South Australia, Western Australia, Tasmania, the Northern Territory and the Australian Capital Territory, now offer their own free check-in solutions. These are not covered by the Privacy Act, but their handling of personal information is regulated by the privacy legislation of the relevant state or territory. They also give users clear and specific privacy assurances, undertaking not to use the data for secondary purposes such as marketing, and promising to delete it after 28 days if it is not needed for contact tracing.

To date, there has been no legislative action at the federal level to ensure that check-in service providers are subject to privacy laws. Since 1 January 2021, NSW has required all hospitality venues and hairdressers to use the free check-in app developed by the state government. Other state and territory governments strongly encourage businesses to use their check-in apps, but have not mandated it. The federal privacy regulator – the Office of the Australian Information Commissioner (OAIC) – recently completed consultation on draft guidance for COVID-19 check-in solutions. The current (draft) recommendation is that businesses choose their check-in apps carefully – in particular, that they use providers subject to the Privacy Act – and that other service providers voluntarily opt in to coverage under the Privacy Act (under section 6EA).

It is also worth noting that Australia is currently undertaking a review of the Privacy Act. An issues paper has been published to seek community input into the review, and one of the questions it poses is whether the small business exemption should be amended. Recent experience with check-in solutions highlights why this question is more relevant than ever. The small business exemption is likely to be considered in detail in the review: is it still appropriate in the digital age, or would amending or removing it create an unreasonable burden for small businesses or stifle innovation? Significantly, the OAIC itself has recommended removing the small business exemption.

Why is this important?

For any digital service to be effective, the public must be able to trust it. Creating the conditions for trust requires transparency, clear rules and clear consequences for breaking those rules. If we cannot establish a trusted relationship with users, users may act to protect themselves – for example, by giving fake names or contact details – which ultimately makes all of us less safe.

Ensuring that privacy is protected – and is seen to be protected – is key to building that trust. As we continue to develop digital solutions to support public health, trust must be a central consideration in the design process.

How can organizations build trust?

The first step is to ensure that your organization uses a trustworthy COVID-19 check-in solution. Customers do not necessarily distinguish between an organization and its service provider: if a service provider breaches privacy, it may adversely affect the organization that engaged it. Depending on the solution, consider assessing your service provider’s privacy and security practices, how personal information will flow through the solution, and where your organization fits into that flow. This can help you identify risks proactively and build in protections.

If your organization wants to build trusted relationships with customers, start with transparency. Give customers clear, concise and readable privacy notices whenever you ask for their information, so they understand what you are collecting, why it is needed and how it will be used. Back this up with a more comprehensive (but still plain-language) privacy policy so that customers can find out more if they wish.

Behind the scenes, consider whether your policy framework and internal processes are up to the task. It is not enough simply to ask customers to trust you; you need to be able to show that you are trustworthy. As an organization, are you handling personal information responsibly and consistently? Have you implemented policies and processes that enable you to comply with the Australian Privacy Principles, as required by APP 1 (open and transparent management of personal information)? Are those policies and processes actually effective?
