The STLs from CadQuery and OpenSCAD are not compressed, so we're
throwing away bandwidth and taking a performance hit by not gzipping.
Serving gzip from S3 basically requires gzipping the file before upload
and then setting
'content-type' : 'text/stl'
'content-encoding' : 'gzip'
on the object.
https://stackoverflow.com/questions/8080824/how-to-serve-gzipped-assets-from-amazon-s3
The obvious part that needs to change is putObject in
app/api/src/docker/common/utils.js, but there might be a few more
nuances.
Resolves #391
I've been able to get a proof of concept of downloading an OpenSCAD
library when the Docker image builds:
https://twitter.com/IrevDev/status/1400785325509660678
Since it's experimental at the moment, I'll leave it with just the one
library for now.
I've also got local dev working again for the CAD lambdas.
Resolves #338
Doing so has a number of benefits:
- Overcomes the 10 MB limit of the API Gateway the lambdas have to go
through
- By storing the key as the hash of the code, we can return previously
generated assets, i.e. caching
- Cost: transferring assets into the bucket within the AWS ecosystem is
faster than returning them through the gateway, and therefore the
lambdas execute for less time
- Sets us up for the future: when generating artifacts for repos on a
change to master etc., we'll want to store these assets somewhere, and
S3 is an obvious choice
- Solves a weird CORS issue where I couldn't get CORS working with
binaryMediaTypes enabled; we don't need binary types when dumping into
S3
Resolves #316