Proptech: Managing Concerns About Privacy and Monitoring
To implement and use surveillance technologies and the associated data in a property environment, proptech users need to understand and manage the attendant privacy and other legal issues.
Closed-circuit television and other surveillance technologies are now widely deployed, and newer surveillance technologies such as facial recognition are also gaining popularity. By its very nature, this technology collects and stores a sizeable amount of personal data.
Although the cutting-edge features these technologies offer are practical and can enhance a building’s efficiency and security as well as the experience of its occupants, their use is not without risk. Building owners and users in particular need to ensure that any surveillance technologies are used in a transparent, privacy-conscious, and legally compliant manner.
What is the purpose of surveillance technology?
For instance, in a store or office setting, surveillance technologies might be used to:
- Improve access controls across the property to replace more conventional security measures (for instance, to eliminate the requirement for a concierge or security guard to watch entrance gates on the ground floor of a building);
- Observe and monitor people, including to detect and deter security threats and criminal conduct;
- Track foot traffic to identify and analyze how spaces are being used (for instance, to help manage visitor room bookings or hot-desk planning);
- Identify and monitor hazardous areas within buildings; and
- Give building occupants real-time information on how spaces (such as meeting rooms and end-of-trip facilities) are being used.
Privacy and other legal considerations
Insofar as it contains information about an “identified individual,” or a person who is reasonably identifiable (for instance, by facial features and other identifying characteristics, such as tattoos), visual imagery and data collected through surveillance technologies will constitute personal information as defined by the Privacy Act 1988 (Cth) (Privacy Act).
Relevantly, the use of surveillance technologies may involve collecting particular “sensitive information” as that term is defined in section 6 of the Privacy Act, such as health information, biometric templates, and biometric information intended for use in automated biometric verification or biometric identification. Sensitive information is afforded a higher level of protection under the Privacy Act and the related Australian Privacy Principles (APPs).
Organizations that fall within the definition of “APP entities” under the Privacy Act and that collect sensitive information through facial recognition and other surveillance technologies must:
- Collect this information only by lawful and fair means;
- Notify the individual that their personal information is being collected, in line with APP 5; and
- Ensure that all collection, use, and disclosure of the data complies with the other applicable APPs.
Notably, under the Privacy Act, APP entities are prohibited from collecting sensitive information about a person unless the person consents to the collection and the entity satisfies all other requirements under APP 3. There are also specific, limited exceptions, such as where the collection of sensitive information is required or authorized by or under an Australian law.
It is important to bear in mind that other laws may also apply to the use of CCTV and facial recognition technologies, including individual State and Territory privacy and surveillance device legislation and specific workplace surveillance laws such as the Workplace Surveillance Act 2005 (NSW). That Act regulates employee surveillance in the workplace and stipulates that surveillance of an employee cannot begin without prior written notice to the employee. Particular forms of workplace surveillance, including camera surveillance, are subject to additional, specific requirements.
Recognizing bias and other errors
No technology is entirely fail-safe. Concerns have been raised that algorithmic or user error, which may carry racial or gender bias, can cause facial surveillance to fail. It is therefore crucial that all applications of surveillance technology be continuously tested, monitored, and kept compliant with anti-discrimination laws. Every technological output should be carefully reviewed and validated by a human; the technology should be a tool, not the sole source of truth.
Users of surveillance technologies must also carefully manage the cybersecurity and other security risks associated with collecting enormous amounts of data that could be extremely valuable to a hacker or other bad actor (you can find some tips on managing and mitigating cybersecurity risks in a proptech context here). Furthermore, information obtained through surveillance technologies must not be used for any illegal or illegitimate purpose, such as tracking a current or former partner.
The road ahead
As newer technologies such as facial recognition become more common and attract substantial media coverage, we are likely to see more focused oversight and regulation of the field.
The adoption of facial recognition technology by large retailers such as Kmart Australia and Bunnings has already been the focus of substantial media coverage and concerns from consumer organisation CHOICE. Bunnings reportedly used facial recognition technology to help identify people who had previously been involved in incidents of concern within its stores, and Kmart Australia reportedly used it to prevent fraud and criminal activity. Both Bunnings and Kmart Australia are said to have since paused their use of facial recognition technology.
The use of facial recognition technology by these organizations is currently the subject of an inquiry by the Office of the Australian Information Commissioner (OAIC). This follows the OAIC’s 2021 finding that 7-Eleven unlawfully collected sensitive biometric data (through facial imaging while surveying customers about their in-store experience) and thereby interfered with their privacy, because the collection was not reasonably necessary for 7-Eleven’s functions and was done without giving customers adequate advance notice as required by the APPs. The Commissioner determined that facial images and faceprints are sensitive information under the Privacy Act, because they are biometric information used for automated biometric identification and because faceprints are biometric templates.
Facial Recognition Technology: Towards a Model Law, a report recently released by the University of Technology Sydney’s Human Technology Institute, proposes a risk-based approach to the use and deployment of facial recognition technology. To determine the overall risk level of a given facial recognition application, the model law would require developers and deployers of the technology to assess human rights vulnerabilities both individually and cumulatively. This goes beyond the general privacy considerations under current law and forces users to think more broadly about the use and application of facial recognition technology, taking into account factors such as where the technology is deployed, the ability of each application to produce reliable results, and whether the affected individuals have had the opportunity and capacity to give free and informed consent to the collection of their facial data. The results of this assessment would determine the risk rating for the relevant facial recognition application, whether the technology can be deployed at all and, if so, the degree of restriction placed on its use. The model law would prohibit high-risk uses except in special circumstances, such as limited law enforcement exceptions.
Incorporating surveillance technology into your proptech
In a proptech setting, property owners and users of surveillance technologies should:
- Carefully consider how they use the technology, taking into account any potential privacy implications; ideally, this assessment should be conducted before the technology is deployed;
- Consider the type of consent they seek, and how they seek it, when collecting personal information via surveillance technologies;
- Ensure that clear, transparent notice is given about the use of the technologies and the collection of personal information, in a form that satisfies all legal requirements, including those under APP 5 of the Privacy Act;
- Ensure that the technologies, and the systems and procedures associated with their use, comply with all applicable laws, and continually monitor those laws so that the technology remains compliant as legislation develops;
- Manage and mitigate cybersecurity risks, and ensure that all personal information generated by the use of surveillance technologies is safeguarded against loss, theft, misuse, and unauthorized access; and
- Include specific clauses in any relevant contractual arrangements (such as a lease between a landlord and tenant of a building that uses surveillance equipment) that effectively manage the legal and other risks of deploying the technology. For instance, it may be worth including privacy clauses, as well as clauses specifying who is responsible for maintaining and managing the technology.