
Interpreting California’s IoT Security Law SB 327


In our last post we talked about California’s SB 327 Information privacy: connected devices law. Today we’re going to discuss what it takes to comply with the law. Our conversations at CES 2018 revealed much confusion on this point. Please keep in mind that we are not attorneys; until there is more clarity on compliance, such as approved certification programs, you should seek counsel to mitigate your risk.

In essence, SB 327 requires that a connected device have “reasonable security” and then gives some nebulous language as to what that means. We believe this language can be boiled down to the following:

1. The strength of a device’s security system must match the application. For example, reasonable security for a soil monitor is not as strong as reasonable security for a surveillance camera.  

2. The security system must protect against hackers accessing and modifying the device.

3. If a device can be accessed from the public Internet, it can comply with the law by either a) having a default password unique to each device (i.e., you can’t flash every device with the same password), or b) requiring the user to set a new password on first access. A sketch of option a) appears below.
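Here is a minimal sketch of option a), using only Python’s standard library. The serial-number format and provisioning flow are illustrative assumptions; the point is that each unit leaves the factory with its own random credential, so the firmware image never contains a shared password.

```python
import secrets
import string

def provision_device(serial_number: str) -> dict:
    """Generate a unique default password for one device at flashing time.

    The password is random per unit, so compromising one device (or the
    firmware image itself) does not expose the credentials of the fleet.
    """
    alphabet = string.ascii_letters + string.digits
    password = "".join(secrets.choice(alphabet) for _ in range(16))
    # The password would be written to the unit's protected storage and
    # printed on its label; only a salted hash should be kept server-side.
    return {"serial": serial_number, "default_password": password}

print(provision_device("SN-000123"))
```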

That gives a little more guidance, but we need more, so let’s look at how the law originated. SB 327 is directly tied to California’s new privacy laws, as noted in the legislative digest at the beginning of the bill, so a device must keep the information it handles from being compromised. SB 327 was also enacted to prevent devices from being incorporated into a botnet such as Mirai, hence requirement #2 above. Let’s look at the various elements of device security with these insights in mind.

Communication:

In order for a device to communicate securely it must a) encrypt the data it is sending, b) use a communication link that authenticates both the device and the recipient, and c) correctly authorize the sender and receiver (typically done by the application). For example, when a device sends data to a server, the device must authenticate and authorize the server, the server must authenticate and authorize the device, and communications between the device and the server need to be encrypted. The most common way to secure communications is with TLS (which, when used in conjunction with HTTP, is called HTTPS). Unfortunately, the default TLS configuration authenticates only the server, not the client device sending the data, so you must take additional steps to comply with SB 327.
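To make that concrete, here is a minimal sketch of what default TLS gives you: an encrypted session in which the device verifies the server’s certificate. The hostname, path, and payload are hypothetical, and note that nothing here authenticates the device to the server yet.

```python
import http.client
import json
import ssl

# Hypothetical endpoint; replace with your own backend.
HOST = "api.example.com"

# create_default_context() verifies the server's certificate chain and
# hostname, and negotiates an encrypted TLS session.
context = ssl.create_default_context()

conn = http.client.HTTPSConnection(HOST, context=context)
body = json.dumps({"device_id": "dev-001", "soil_moisture": 0.31})
conn.request("POST", "/v1/readings", body,
             headers={"Content-Type": "application/json"})
print(conn.getresponse().status)
```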

There are several mechanisms you can use to authenticate the client (sender) of a TLS session. The HTTPS call could include a device ID parameter and/or a shared secret (i.e., “symmetric”) key. A recent trend is to leverage the OAuth2 protocol and embed a token in the device for communicating with the server. The problem with these mechanisms is that all devices are compromised if a hacker accesses the IDs/keys/tokens stored on the central server. This can be mitigated by using a hardware security module to store secret keys in the cloud, but this is tough to scale.
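A sketch of the shared-secret approach, assuming a hypothetical API that accepts X-Device-Id and X-Signature headers (an OAuth2 variant would instead send an Authorization: Bearer token). The weakness is visible in the comments: the server must hold a copy of every device’s secret.

```python
import hashlib
import hmac
import json

# Per-device shared secret; the server stores a copy, which is why a
# breach of that database compromises every device at once.
DEVICE_ID = "dev-001"
DEVICE_SECRET = b"loaded-from-device-storage"

payload = json.dumps({"soil_moisture": 0.31}).encode()
signature = hmac.new(DEVICE_SECRET, payload, hashlib.sha256).hexdigest()

# Hypothetical header names; the server recomputes the HMAC with its own
# copy of the secret and compares.
headers = {
    "X-Device-Id": DEVICE_ID,
    "X-Signature": signature,
    "Content-Type": "application/json",
}
```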

Fortunately, there is an alternative that is not vulnerable to server breaches: asymmetric, or public key, cryptography. TLS uses asymmetric keys when the client authenticates the server, and it gives the server the option to use asymmetric keys to authenticate the client as well. To implement TLS client authentication with asymmetric keys, each device must have an X.509 certificate whose public key corresponds to a secret (“private”) key that never leaves the device. The server only needs the public certificate to authenticate the client; it does not need to store a symmetric secret key. Given SB 327’s origins in privacy, we believe client authentication with certificates (also called “PKI”) is the safest path to meet SB 327 requirements.
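Here is a minimal sketch of mutual TLS with Python’s ssl module. File names are placeholders; the device loads its own certificate and private key, while the server requires a client certificate and trusts only the CA that signed the fleet’s certificates.

```python
import http.client
import ssl

# --- Device side: present an X.509 certificate during the TLS handshake ---
# The private key was generated on (and never leaves) the device.
client_ctx = ssl.create_default_context()
client_ctx.load_cert_chain(certfile="device-cert.pem", keyfile="device-key.pem")
conn = http.client.HTTPSConnection("api.example.com", context=client_ctx)

# --- Server side: demand and verify the client certificate ---
server_ctx = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
server_ctx.load_cert_chain(certfile="server-cert.pem", keyfile="server-key.pem")
server_ctx.verify_mode = ssl.CERT_REQUIRED                # reject clients without a cert
server_ctx.load_verify_locations(cafile="device-root-ca.pem")  # trust only your signing CA
```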

Certificates:

Not all certificates are created equal. First, there are many types of certificates and only a few are approved by the National Institute of Standards and Technology (NIST). Second, certificates are only as strong as the “root” private key that signs them. If you create your own root key and sign your certificates with it (called “self-signing”), your system’s security is worthless if your root key is exposed: a hacker who gets possession of your root key can sign bogus certificates for rogue devices and your server will not know the difference. The root keys of certificates commonly used on the Internet are closely guarded in offline mountain bunkers and are only accessed using audited procedures (FIPS, for those who want to know). There are companies you can hire that have these bunkers and procedures to store your root key; manufacturers as big as Apple and Samsung use them. Hosting your own root key is probably fine for SB 327 for small companies, but if your root key gets compromised and you have high-value assets or data, you’ll have to explain in court why you chose a root key implementation that does not match the value of your application.
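As an illustration of what “self-signing” means in practice, here is a sketch using the third-party cryptography package. Names, validity periods, and key choices are illustrative assumptions; the final comment is the whole point of the paragraph, namely that whoever holds root_key can mint certificates your server will trust.

```python
import datetime
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Root key: in production this belongs in an offline, audited facility;
# generating it on a laptop like this is exactly the risk described above.
root_key = ec.generate_private_key(ec.SECP256R1())
root_name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "Example Device Root CA")])

root_cert = (
    x509.CertificateBuilder()
    .subject_name(root_name)
    .issuer_name(root_name)                       # self-signed: issuer == subject
    .public_key(root_key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(datetime.datetime.utcnow())
    .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=3650))
    .add_extension(x509.BasicConstraints(ca=True, path_length=None), critical=True)
    .sign(root_key, hashes.SHA256())
)

# Device key stays on the device; only its public key goes into the cert.
device_key = ec.generate_private_key(ec.SECP256R1())
device_cert = (
    x509.CertificateBuilder()
    .subject_name(x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "dev-001")]))
    .issuer_name(root_name)
    .public_key(device_key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(datetime.datetime.utcnow())
    .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=3650))
    .sign(root_key, hashes.SHA256())              # whoever holds root_key can mint these
)
```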

Device Access:

SB 327 devotes a lot of language to accessing devices because, done poorly, device access is how devices get taken over by hackers. Requirement #3 above addresses devices that can be accessed remotely, but SB 327 also covers devices that can be accessed on-site. Devices without a screen rely on a phone app to configure the device. Most commonly, the device sets up its own WiFi access point with a pre-programmed SSID and the phone app knows to connect to that SSID. Very often the manufacturer programs all devices with the same SSID. This isn’t nearly as bad as using the same password for remote login, because a hacker needs to be within WiFi range of the device while it is in “AP mode” to access it, so this mechanism is probably compliant with SB 327. A more secure implementation would at least use a unique password for the AP, which can be printed on the device for the user to enter into the phone app.

Server Login:

SB 327 does not address how authentication and authorization are done for websites, but there are best practices around how to store passwords and how to implement OAuth2 for Google and Facebook login. You need to either learn how to implement these correctly or outsource this to a company that specializes in it (which is exactly what Wivity does).
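For the password-storage piece, here is a minimal sketch of the standard practice: never store the password itself, only a salted, deliberately slow hash of it. This uses only Python’s standard library; the iteration count is an illustrative choice.

```python
import hashlib
import secrets

ITERATIONS = 600_000  # illustrative; tune to your hardware budget

def hash_password(password: str) -> tuple:
    """Return (salt, digest); store both, never the plaintext password."""
    salt = secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return secrets.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
```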

Hardware Secure Elements:

Much of the focus in IoT security is around hardware secure elements. A secure element is a piece of hardware that safely stores the device’s private key and other secret information. SB 327 does not require secure elements, but it does say the security must be appropriate to the nature, function, and information of the application. If your device sends highly valuable data and sits in a location that is easily accessible to hackers, you might want to consider using a secure element to protect your device’s private keys. Payment terminals, for example, are required to use secure elements to comply with PCI in many cases.
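The APIs differ by part and vendor SDK, so the following is only a hypothetical interface, not real code for any specific chip. It captures the architectural point: the private key is generated inside the secure element and only signatures ever come out.

```python
class SecureElement:
    """Hypothetical wrapper; a real implementation delegates to a vendor SDK."""

    def generate_keypair(self, slot: int) -> bytes:
        """Create a key pair inside the chip and return only the public key."""
        raise NotImplementedError("provided by the vendor SDK")

    def sign(self, slot: int, digest: bytes) -> bytes:
        """Sign a digest (e.g., part of a TLS handshake) inside the chip.

        The private key never crosses into the application processor's
        memory, so malware on the host cannot copy it.
        """
        raise NotImplementedError("provided by the vendor SDK")
```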

IEEE 2030.5 and CA Rule 21:

Reasonable security in the distributed energy space should be interpreted with knowledge of existing standards such as IEEE 2030.5. IEEE 2030.5 specifies TLS 1.2 with ECC secp256r1 client certificates, and Rule 21 requires those certificates to be signed by SunSpec’s infrastructure. Even though Rule 21 only specifies security requirements for communications directly with the utility server, we believe SB 327 will hold exporting DER devices to the same standard as IEEE 2030.5, since that standard is already accepted by government regulators. If you are a DER manufacturer and plan to use an aggregation service to comply with Rule 21, make sure the aggregation service uses the right level of security. Using a service that does not match the security level specified in IEEE 2030.5 and Rule 21 (i.e., TLS client authentication with certificates providing 128 bits of security, signed by a reputable signing authority) is a risk.
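Here is a sketch of a client-side TLS context pinned to that profile, TLS 1.2 with an ECC (secp256r1) client certificate, using Python’s ssl module. File paths are placeholders, and a real IEEE 2030.5 stack has additional requirements (such as specific cipher suites) not shown here.

```python
import ssl

# Pin the session to TLS 1.2 and present the DER device's secp256r1
# client certificate; trust only the utility/SunSpec signing chain.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2
context.maximum_version = ssl.TLSVersion.TLSv1_2
context.load_cert_chain(certfile="der-device-cert.pem", keyfile="der-device-key.pem")
context.load_verify_locations(cafile="utility-ca-chain.pem")
```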

Conclusion:

SB 327 is a nebulous law, much like HIPAA. The healthcare industry solved the uncertainty with HIPAA by creating the industry group HITRUST that drafts unambiguous specifications and certification programs that companies can follow to comply with HIPAA-style laws around the world. We believe there needs to be a similar industry certification for IoT security, but we seriously doubt one will be created by SB 327’s January 2020 compliance date. If you need guidance now, hopefully this post will help.

Alfred Tom

Alfred Tom is co-founder and CEO of Wivity, Executive Director of OMA3.org, and president of Lumian.org (a SunSpec.org spinout). He also serves on the board of CCCsf.us, an internationally-acclaimed Warhol/Bloomberg-funded contemporary art organization in San Francisco.