Cryptographically protecting your SPA

Matheus Adorni Dardenne - Mar 17 '23 - Dev Community

[Image: public-key cryptography illustration]

Credits to https://blog.1password.com/what-is-public-key-cryptography/ for the cool image.

TL;DR:

Check this repository for a simple example in NextJS of how to achieve this. Reading the article is recommended, though, for context on why this is useful. Don't forget to give a star to the repository 😁.

Disclaimer

Despite having worked as a software engineer for the past decade, I am not a cryptographer and am not a cybersec specialist. I’m sharing this from the perspective of a developer who was tasked with fixing a bug. I recommend doing your own research on the subject, and always inviting ethical hackers to pentest your applications. Always rely on experts when it comes to security.

Introduction

Recently, the application I’ve been working on for a little more than a year went through a pentest (penetration test, where hired ethical hackers try to break into your application and report its weaknesses so you can fix them; a very useful cybersecurity practice). It was the first time this system had been put through such a procedure.

The System

The system comprises a front-end SPA built with ReactJS and a back-end API built with NodeJS. As a software engineer with some 10 years of experience under my belt, I designed both to be resistant to the usual culprits.

I won’t focus on those here, but I recommend extensively researching any common attack vectors you’re not familiar with. I was confident, but I was in for a wild ride.

The Report

All of these security measures were praised in the final report. However, there was one attack that managed to get through: a particular form of man-in-the-middle attack that allowed the hacker to escalate his access level.

The application itself is protected using SSL certificates on both ends, so the data was pretty secure while in transit. However, the hacker used a specialized tool called Burp Suite to set up a proxy on his own machine, with the tool’s certificate installed in his browser. The proxy routes the network requests to and from the tool and makes each end believe the traffic is legitimately coming from the other. This allowed him to modify any data he wanted.

The Attack

He could effectively fake what the API was sending back to the browser, or fake what the browser was sending to the API. So it isn't exactly a... man... in the middle. It wasn't a third party stealing or changing the information, but it was still a new layer in between that allowed the attacker to do things the application wasn't expecting him to be able to do, and that can break things.

I had never seen such an attack before; I didn't even think it was possible. My fault, really, as the hacker said this is a very common attack vector against SPAs, which must rely on information passing through the network to determine what the user can see and do (such as showing a button that only an admin should see, for example).

From there, all the hacker had to do was figure out what was what in the responses to make the browser believe he was an admin (for example, changing an "isAdmin" property from "false" to "true"). Now he could see some things he wasn’t supposed to see, such as restricted pages and buttons. However, since the back-end validates whether the person requesting administrative data or performing administrative actions is an admin, there wasn’t much he could do with this power... or so we thought, until he found a weak spot.

It was a form that allowed us to quickly create new test users: a feature no normal user was ever supposed to see, and one that was supposed to be removed after development, so we never bothered protecting it. And since the body of the request specifically created a "normal user", we never stopped to think about the security implications. It was never removed; we forgot about it.

Then the hacker used the proxy to modify the body of the request, and managed to create a new user with true admin power. He logged in with this new user and the system was in his hands.

I know, it was a bunch of stupid mistakes, but are all your endpoints protected? Are you SURE? Because I was “pretty sure”. Pretty sure is not enough. Go double-check them now.

The Debate - Damage Control

Obviously, the first thing we did was delete his admin account and properly gate the endpoint he had used to create the user, requiring admin access and preventing it from accepting the parameters that would grant a new user admin access (it turns out we still needed that form for some tests and didn't want to delete it just yet). We also did a sweep of other endpoints related to development productivity to confirm they were all gated behind admin access, and fixed those that weren't.
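To give an idea of what that gating looks like, here is a minimal sketch assuming an Express-style API; the middleware, route, and user shape are illustrative, not our actual code.

import type { Request, Response, NextFunction } from 'express'

// Illustrative middleware: reject anyone whose server-side session user isn't an admin.
function requireAdmin(req: Request, res: Response, next: NextFunction) {
  const user = (req as any).user // however your auth layer attaches the authenticated user
  if (!user?.isAdmin) {
    return res.status(403).send({ error: 'Forbidden' })
  }
  next()
}

// The test-user endpoint is now gated, and the role is forced server-side
// instead of being taken from the request body:
// router.post('/dev/test-users', requireAdmin, (req, res) => {
//   createTestUser({ ...req.body, isAdmin: false }) // createTestUser is a hypothetical helper
//   res.status(201).send({ ok: true })
// })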

The Debate - SSR?

The cat was out of the bag. We needed a solution; we still had to prevent attackers from seeing pages and buttons they weren't supposed to see. Moving the whole React app to a NextJS instance was considered, so we could rely on SSR to process the ACL: the components a user is allowed to see would be determined on the server side, and since that decision would never be sent through the network, it couldn’t be faked. This is likely the best approach, and it will be done in the near future, but it is very time-consuming (and isn't always viable), and we needed a solution fast.
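As a rough illustration of that idea (not something we have implemented yet), an admin-only page in NextJS could run its ACL check in getServerSideProps; getUserFromSession below is a hypothetical helper.

import type { GetServerSideProps } from 'next'
import { getUserFromSession } from '@/utils/auth' // hypothetical session helper

export const getServerSideProps: GetServerSideProps = async (ctx) => {
  const user = await getUserFromSession(ctx.req)
  // The decision is made on the server; non-admins never receive the admin page at all.
  if (!user?.isAdmin) {
    return { notFound: true }
  }
  return { props: { user } }
}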

The Debate - What would the solution even look like?

So, we needed a way to verify that the message sent by the API had not been tampered with, which obviously meant some form of cryptography. Someone suggested HMAC, but the message can't simply be authenticated with a secret shared by both sides: since the hacker has access to the source code running in his browser, he could easily find the secret and use it to sign any tampered response. So HMAC (and pretty much any form of symmetric cryptography) was off the table. I needed a way to sign a message on one side, with the other side able to verify that the signature is valid without being able to produce a valid signature itself.

The Debate - The solution

Then we realized: this sounds a lot like a public-private key pair, like the ones we use for SSH! We would have a private key that stays in the API's environment, which we use to sign the response, and a public key that is compiled into the front end to verify the signature. This is called asymmetric cryptography. BINGO! We would need something like RSA keys to sign and verify the messages. How difficult could it be? Turns out… very difficult. At least if you, like me back then, have no idea where to even start.

The implementation - Creating the keys

After hours of trial and error with several different commands (such as using ssh-keygen and then exporting the public key to PEM format), I managed to find the commands that create the keys properly. I’m not a cryptographer and can’t explain in detail why the other commands I tried failed later, when importing the keys, but from my research I concluded that keys come in several different formats, and the OpenSSH format used for SSH is not the same as the PEM/SPKI format produced by the working command (which is what the Web Crypto API expects for a public key).

These are the ones that worked.
For the private key:
openssl genrsa -out private-key-name.pem 3072
For the public key:
openssl rsa -in private-key-name.pem -pubout -out public-key-name.pem

You can change the number of bits in the first command; it is the size of the RSA modulus (the product of two gigantic prime numbers), but keep in mind that you will then have to change some other things later (such as the modulusLength in the front-end key config).
As a rule of thumb, more bits = more security but less speed.

The implementation - The Back-end

Implementing this on the back-end was very straightforward. NodeJS has a core module named crypto that can be used to sign a message in a few lines of code.

I wrote a simple response wrapper to do this. It expects an input that looks something like this:
{ b: 1, c: 3, a: 2 }
And its output will look something like this:

{
  content: { b: 1, c: 3, a: 2 },
  signature: "aBc123dEf456"
}

But I immediately ran into problems, which I’ll quickly go through, briefly explaining how I solved each one.

  • When you stringify JavaScript objects into JSON, they don’t always keep their “shape” letter for letter. The content remains the same, but sometimes properties appear in a different order. This is expected behavior for JSON, whose specification treats object member order as insignificant, but if we are going to use the string as a message to be signed, it MUST be identical, letter for letter. I found this function, which can be passed as the second argument to JSON.stringify to achieve exactly what we need: it orders the properties alphabetically, so we can count on them always being stringified in the same order. This is what the function looks like.
export const deterministicReplacer = (_, v) => {
  return typeof v !== 'object' || v === null || Array.isArray(v) ? v : Object.fromEntries(Object.entries(v).sort(([ka], [kb]) => {
    return ka < kb ? -1 : ka > kb ? 1 : 0
  }))
}

const message = JSON.stringify({ b: 2, c: 1, a: 3 }, deterministicReplacer)
// Will always output a predictable {"a":3,"b":2,"c":1}
  • Just to avoid dealing with quotes and brackets, which were causing headaches by sometimes being “escaped” in certain situations and resulting in different strings, I decided to encode the whole stringified JSON into base64. And this worked initially.
Buffer.from(message, 'ascii').toString('base64')
  • Later I had problems because I was reading the input string as ASCII. It turns out that if the message contains any character that takes more than one byte to encode (such as an emoji or a bullet point), that process produces a bad signature that the front-end is unable to verify (see the short example after this list). The solution was using UTF-8 instead of ASCII, but this required changes to how things were processed on the front end. More on this later.
Buffer.from(message, 'utf-8').toString('base64')
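To illustrate the encoding pitfall, here is a quick demonstration (not part of the wrapper itself): as soon as a multi-byte character appears, the 'ascii' and 'utf-8' encodings produce different bytes, and therefore different signatures.

// Demonstration only: multi-byte characters are mangled by the 'ascii' encoding.
const msg = JSON.stringify({ note: 'done ✓' })

const asAscii = Buffer.from(msg, 'ascii').toString('base64')
const asUtf8 = Buffer.from(msg, 'utf-8').toString('base64')

console.log(asAscii === asUtf8) // false, so a signature over one set of bytes cannot verify against the other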

This is what the final working code for the back end part looks like:

import crypto from 'crypto'
import { deterministicReplacer } from '@/utils/helpers'

export const signContent = (content) => {
  const privateKey = process.env.PRIVATE_KEY
  if (!privateKey) {
    throw new Error('The environmental variable PRIVATE_KEY must be set')
  }
  const signer = crypto.createSign('RSA-SHA256')

  const message = JSON.stringify(content, deterministicReplacer)
  const base64Msg = Buffer.from(message, 'utf-8').toString('base64')
  signer.update(base64Msg)

  const signature = signer.sign(privateKey, 'base64')

  return signature
}

export const respondSignedContent = (res, code = 200, content = {}) => {
  const signature = signContent(content)
  res.status(code).send({ content, signature })
}
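For context, this is roughly how the wrapper can be used in an API route; the file path, import path, and payload below are illustrative (the example repository uses NextJS API routes).

// pages/api/user.ts (illustrative): every endpoint responds through the wrapper,
// so the signature is always computed over the exact content being sent.
import type { NextApiRequest, NextApiResponse } from 'next'
import { respondSignedContent } from '@/utils/rsa' // wherever signContent/respondSignedContent live in your project

export default function handler(req: NextApiRequest, res: NextApiResponse) {
  respondSignedContent(res, 200, { name: 'John Doe' })
}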

The implementation - The front-end

The plan was simple:

  1. Receive the response with the content and the signature.
  2. Deterministically stringify the content (using the same deterministicReplacer function we used in the back-end).
  3. Encode it in base64 as a UTF-8 string, just like in the back-end.
  4. Import the public key.
  5. Use the public key to verify this message against the signature in the response.
  6. Reject the response if verification fails.

I searched around for front-end equivalents of crypto and tried some of them, but in the end came up empty-handed. It turns out this module is backed by native (C++/OpenSSL) code and can’t run in the browser, so I decided to use the built-in Web Crypto API, which works well on modern browsers.

The code for steps 1-3 is quite long and uses a few nearly unreadable functions I found around the internet and then modified and combined in a way to normalize the data in the format that is needed. To see it fully, I recommend going directly to the files rsa.ts and helpers.ts.
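The sketch below only shows what that function needs to do to mirror the back-end steps (deterministic JSON, UTF-8 bytes, base64, and finally the raw bytes of that base64 string, since that is what the back-end signs); the actual implementation lives in the files linked above, and textToUi8Arr is shown a bit further down.

// Sketch only; see rsa.ts / helpers.ts for the real implementation.
function stringifyAndBufferifyData(data: unknown): ArrayBufferLike {
  // Step 2: deterministically stringify, using the same replacer as the back-end.
  const json = JSON.stringify(data, deterministicReplacer)
  // Step 3: UTF-8 encode and base64 the bytes, mirroring Buffer.from(json, 'utf-8').toString('base64').
  const utf8Bytes = new TextEncoder().encode(json)
  let binary = ''
  utf8Bytes.forEach((b) => { binary += String.fromCharCode(b) })
  const base64 = window.btoa(binary)
  // The back-end signs the base64 string itself, so verification must run over its raw bytes.
  return textToUi8Arr(base64).buffer
}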

For steps 4-5, I studied the Web Crypto API docs and figured out that the function that imports the public key expects the data as an ArrayBuffer (among other formats; check the docs for reference). A PEM key comes with a header, a footer, and a body encoded in base64 (the actual content of the key). Since base64 is plain ASCII, we can simply use the window.atob function: we strip the header and footer and decode the body to get to its binary data.

This is what it looks like in code.

function textToUi8Arr(text: string): Uint8Array {
  let bufView = new Uint8Array(text.length)
  for (let i = 0; i < text.length; i++) {
    bufView[i] = text.charCodeAt(i)
  }
  return bufView
}


function base64StringToArrayBuffer(b64str: string): ArrayBufferLike {
  const byteStr = window.atob(b64str)
  return textToUi8Arr(byteStr).buffer
}


function convertPemToArrayBuffer(pem: string): ArrayBufferLike {
  const lines = pem.split('\n')
  let encoded = ''
  for (let i = 0; i < lines.length; i++) {
    if (lines[i].trim().length > 0 &&
      lines[i].indexOf('-BEGIN RSA PUBLIC KEY-') < 0 &&
      lines[i].indexOf('-BEGIN RSA PRIVATE KEY-') < 0 &&
      lines[i].indexOf('-BEGIN PUBLIC KEY-') < 0 &&
      lines[i].indexOf('-BEGIN PRIVATE KEY-') < 0 &&
      lines[i].indexOf('-END RSA PUBLIC KEY-') < 0 &&
      lines[i].indexOf('-END RSA PRIVATE KEY-') < 0 &&
      lines[i].indexOf('-END PUBLIC KEY-') < 0 &&
      lines[i].indexOf('-END PRIVATE KEY-') < 0
    ) {
      encoded += lines[i].trim()
    }
  }
  return base64StringToArrayBuffer(encoded)
}

The final code to import the key looks like this:

const PUBLIC_KEY = process.env.NEXT_PUBLIC_PUBLIC_KEY


const keyConfig = {
  name: "RSASSA-PKCS1-v1_5",
  hash: {
    name: "SHA-256"
  },
  modulusLength: 3072, //The same number of bits used to create the key
  extractable: false,
  publicExponent: new Uint8Array([0x01, 0x00, 0x01])
}


async function importPublicKey(): Promise<CryptoKey | null> {
  if (!PUBLIC_KEY) {
    return null
  }
  const arrBufPublicKey = convertPemToArrayBuffer(PUBLIC_KEY)
  const key = await crypto.subtle.importKey(
    "spki", //has to be spki for importing public keys
    arrBufPublicKey,
    keyConfig,
    false, //false because we aren't exporting the key, just using it
    ["verify"] //has to be "verify" because public keys can't "sign"
  ).catch((e) => {
    console.log(e)
    return null
  })
  return key
}

We can then use it to verify the content and signature of the response like so:

async function verifyIfIsValid(
  pub: CryptoKey,
  sig: ArrayBufferLike,
  data: ArrayBufferLike
) {
  return crypto.subtle.verify(keyConfig, pub, sig, data).catch((e) => {
    console.log('error in verification', e)
    return false
  })
}

export const verifySignature = async (message: any, signature: string) => {
  const publicKey = await importPublicKey()

  if (!publicKey) {
    return false //or throw an error
  }

  const msgArrBuf = stringifyAndBufferifyData(message)
  const sigArrBuf = base64StringToArrayBuffer(signature)

  const isValid = await verifyIfIsValid(publicKey, sigArrBuf, msgArrBuf)

  return isValid
}

Check the files rsa.ts and helpers.ts linked above to see the implementation of stringifyAndBufferifyData.

Finally, for step 6, just use the verifySignature function and either throw an error or do something else to reject the response.

const [user, setUser] = useState<User>()
const [isLoading, setIsLoading] = useState<boolean>(false)
const [isRejected, setIsRejected] = useState<boolean>(false)

useEffect(() => {
  (async function () {
    setIsLoading(true)
    const res = await fetch('/api/user')
    const data = await res.json()

    const signatureVerified = await verifySignature(data.content, data.signature)
    setIsLoading(false)
    if (!signatureVerified) {
      setIsRejected(true)
      return
    }
    setUser(data.content)
  })()
}, [])

This is obviously just an example. In our implementation, we wrote this verification step into the “base request” that handles all requests in the application, and it throws an error that displays a warning saying the response was rejected whenever verification fails.
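As a rough sketch of that idea (the wrapper name and error handling are illustrative, not our actual code):

// Illustrative "base request": every response is verified before its content is used.
async function signedFetch<T>(input: RequestInfo, init?: RequestInit): Promise<T> {
  const res = await fetch(input, init)
  const data = await res.json()

  const isValid = await verifySignature(data.content, data.signature)
  if (!isValid) {
    // Surface this however your app handles errors; here we simply throw.
    throw new Error('Response rejected: signature verification failed')
  }
  return data.content as T
}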

And that’s how you do it. 😊

Notes on Performance

We thought this could heavily impact the performance of the API, but the difference in response times was imperceptible: on average less than 10ms for our 3072-bit key (and a bit less than 20ms for a 4096-bit key). Since the same message always produces the same signature, a caching mechanism could easily be implemented to improve performance on “hot” endpoints if this ever becomes a problem. With this configuration, the base64-encoded signature is always a 512-character string (384 raw bytes for a 3072-bit key), so expect the size of each response to grow by roughly that much; the actual network traffic increase is lower thanks to compression. In the example, the response for the {"name":"John Doe"} JSON ended up with 130 bytes. We decided it was an acceptable compromise.
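Such a cache is not something we have shipped, but a minimal sketch could key signatures by the deterministic JSON string:

// Sketch of a possible cache: identical content always yields an identical signature.
const signatureCache = new Map<string, string>()

export const signContentCached = (content: unknown): string => {
  const cacheKey = JSON.stringify(content, deterministicReplacer)
  const cached = signatureCache.get(cacheKey)
  if (cached) return cached

  const signature = signContent(content) // re-stringifies internally; kept simple for clarity
  signatureCache.set(cacheKey, signature)
  return signature
}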

The Result

The same ethical hacker was invited to attack the application again, and this time he was unable to: the signature verification failed as soon as he tried to change anything. He messed around with it for a couple of days and later reported that he couldn’t break it. The application was declared sufficiently secure... for now.

Conclusion

This works, but I'm not going to lie: not finding comprehensive material on how to do this for this exact purpose made me question whether it is even a good solution. I'm sharing this mostly as a way to have it analyzed and criticized by people wiser than myself, but more importantly, as a way to warn other developers about this attack vector. I also wanted to help others implement a possible solution, since it took me a couple of days of trial and error to figure out how to make everything work together. I hope this saves you some time.

All of this has been condensed into a simplified approach in NextJS and is available in this repository.

Please leave a star on it if you find it helpful or useful.

Please feel completely free to criticize this. As I said, I am not a cryptographer or a cybersec specialist, and will appreciate any feedback.
