Since the release of the Chrome 80 update, working with cookies has become more difficult, from developing in localhost environments to always requiring secure contexts.
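Concretely, cookies without a SameSite attribute are now treated as SameSite=Lax, and cookies marked SameSite=None are only accepted if they are also Secure, which means HTTPS. A minimal sketch of what a back end has to send for a cross-site cookie to survive, using a plain Node.js server and made-up cookie values:

import { createServer } from "node:http";

createServer((req, res) => {
  // Without "SameSite=None; Secure" Chrome 80+ silently drops this cookie
  // on cross-site requests, and "Secure" in turn requires HTTPS.
  res.setHeader("Set-Cookie", "session=abc123; HttpOnly; SameSite=None; Secure");
  res.end("ok");
}).listen(3000);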
I was developing a project for a company I run with a partner; we were building both the front end and the back end. The back end was already done, and all we were missing was the front end. But when we assumed everything was working and tried the dev release, the login request was actually successful, yet we kept being redirected back to the login page.
The problem was the browser not storing the cookies: there was no error anywhere, the cookies were simply not being stored. A lot of 401 (Unauthorized) status codes, and also a lot of frustration.
We were requesting cookies from another origin, from an unsafe (and invalid) context.
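In code, the failing setup looked something like this (the endpoint is illustrative): the page on localhost asking the browser to store cookies from another origin.

// The request itself succeeds, but the Set-Cookie in the response is
// silently discarded: it is cross-site, and the page is not a secure context.
const response = await fetch("https://api.example.com/login", {
  method: "POST",
  credentials: "include", // ask the browser to store and send cookies
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ user: "demo", password: "demo" }),
});
// Every authenticated request after this one comes back as 401.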
The truth is that localhost is not convenient for any security measure; the name is not even a regular domain. A registrable domain name consists of at least two labels separated by a dot, such as "example.com", where "example" is one label and "com" is another, but localhost doesn't have those two labels.
While I was trying to solve this issue I tried a lot of things, such as prepending a name to localhost, e.g. "project-name.localhost". That technically works, but the problem of requesting from a secure context remains, and it also adds a lot of overhead, because CORS needs to be configured dynamically to accept cross-site cookies.
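For reference, this is roughly what that overhead looks like on the back end (a sketch, not my actual server code): credential-aware CORS headers that have to echo the origin for every request.

import type { ServerResponse } from "node:http";

// Allow-Origin cannot be "*" when credentials are involved, so the exact
// origin must be echoed back per request.
function allowCrossSiteCookies(res: ServerResponse, requestOrigin: string) {
  res.setHeader("Access-Control-Allow-Origin", requestOrigin); // e.g. "http://project-name.localhost:3000"
  res.setHeader("Access-Control-Allow-Credentials", "true");
  res.setHeader("Vary", "Origin"); // responses differ per origin
}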
Then I thought: there are layers to this (in fact, there are layers to everything), so could we possibly proxy the request and make it same-origin?
There are a lot of ways to proxy a request so that it becomes same-origin. Since every website front end has something serving its files, we can use that same thing to forward requests, so my solution in this case was to use the Vite proxy to trick the browser into thinking we are requesting the same origin.
Developing the solution in depth, we can find ourselves drawing some sort of schema like this:
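browser
  |  /api/* (same-origin request)
  v
front-end server (Vite dev server / nginx)
  |  https://api.example.com/* (real request, invisible to the browser)
  v
API back end

To the browser everything lives on a single origin, so the cookie is first-party and gets stored without complaint.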
Once we have a solution, the usual next step is making it real. So let's code it.
In my case I was using Vite for the dev server and nginx for the production server in a Docker container, so I will provide both of my solutions.
For the dev server I simply added a proxy in the vite.config.ts file, like this:
import { defineConfig } from "vite";

export default defineConfig({
  // ...
  server: {
    proxy: {
      // Every request to /api/* is forwarded to the real API,
      // so the browser only ever sees a same-origin request.
      "/api": {
        target: "https://api.example.com/",
        changeOrigin: true,
        rewrite(path: string) {
          return path.replace(/^\/api/, ""); // strip the /api prefix
        }
      }
    }
  }
  // ...
});
Then I simply changed all of my fetch calls to go through the proxy path instead; the parameters stayed the same, nothing else had to change.
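In practice the change is just the base of the URL (the endpoint is illustrative):

// Before: cross-origin, the cookie gets rejected
await fetch("https://api.example.com/login", { method: "POST", credentials: "include" });

// After: the same path goes through the Vite proxy, same-origin as the page
await fetch("/api/login", { method: "POST", credentials: "include" });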
Since I usually use a provider to store global data such as the API URL, I simply changed that parameter from the API URL to the proxy path, basically from https://api.example.com/ to /api.
<ApiServiceProvider baseUrl="/api">
  <App />
</ApiServiceProvider>
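For context, a minimal sketch of what such a provider can look like (my real one carries more than the URL, and the helper names here are only illustrative):

import { createContext, useContext, type ReactNode } from "react";

const ApiServiceContext = createContext("/api");

export function ApiServiceProvider(props: { baseUrl: string; children: ReactNode }) {
  return (
    <ApiServiceContext.Provider value={props.baseUrl}>
      {props.children}
    </ApiServiceContext.Provider>
  );
}

// Consumers build request URLs from the configured base:
export function useApiUrl(path: string) {
  const baseUrl = useContext(ApiServiceContext);
  return `${baseUrl}${path}`; // e.g. useApiUrl("/login") -> "/api/login"
}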
And with this the development environment works perfectly; the cookie is now being saved.
For the production environment I added a few lines to the default.conf nginx file that was going to be set in the production Docker container:
server {
    listen 80;

    # Forward /api/* to the real API, stripping the /api prefix
    location ~ /api/(.*) {
        proxy_pass https://api.example.com/$1$is_args$args;
        proxy_set_header Host $http_host;
    }

    # Serve the front end, falling back to index.html for client-side routes
    location / {
        root /app/frontend;
        try_files $uri /index.html;
    }
}
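With that regex location, $1 captures everything after /api/ and $is_args$args re-appends the query string, so a request to /api/users?page=2 (a made-up path) is proxied to https://api.example.com/users?page=2.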
In this case I don't listen on 443 here, as I leave that work to Traefik in Kubernetes: it handles the TLS termination and gives me unencrypted traffic to work with on my end.
Well, that's everything I have to add to this problem. I believe the update wasn't made to be friendly to developers, except perhaps those with big development environments, or those who can self-sign certificates that the browser implicitly trusts.
Thanks for reading ;)