1 Commits

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Kurt Hutten | a78196b496 | Github app start | 2021-05-06 20:07:24 +10:00 |
395 changed files with 13702 additions and 26486 deletions

3
.gitmodules vendored
View File

@@ -0,0 +1,3 @@
[submodule "web/src/cascade"]
path = app/web/src/cascade
url = https://github.com/Irev-Dev/CascadeStudio.git

11
.vscode/settings.json vendored
View File

@@ -1,11 +0,0 @@
{
"cSpell.words": [
"Cadhub",
"Customizer",
"Hutten",
"cadquery",
"jscad",
"openscad",
"sendmail"
]
}

View File

@@ -1,67 +0,0 @@
Hello 👋
Really happy you're checking out how to contribute.
Here you'll find a breakdown of the tech we're using.
If you'd like to get involved, one of the best ways is to drop by the [discord](https://discord.gg/SD7zFRNjGH), say hi and let us know you're interested in contributing. All are welcome.
## Tech used
### Redwood
CadHub is a [RedWood app](https://redwoodjs.com/). Put simply, this means it's a React frontend with a serverless GraphQL backend using Prisma.
We are also using [Tailwind](https://tailwindcss.com/) to style the app.
To learn more about Redwood, here are some useful links:
- [Tutorial](https://redwoodjs.com/tutorial/welcome-to-redwood): getting started and complete overview guide.
- [Docs](https://redwoodjs.com/docs/introduction): using the Redwood Router, handling assets and files, list of command-line tools, and more.
- [Redwood Community](https://community.redwoodjs.com): get help, share tips and tricks, and collaborate on everything about RedwoodJS.
### Cad Packages
Because each CAD package is its own beast, we opted to use Docker to give us lots of flexibility in setting up the environment each package runs in. The containers are run as AWS container lambdas and deployed using the Serverless Framework (JSCAD is an exception since it runs client-side). See [our docs](https://learn.cadhub.xyz/docs/general-cadhub/integrations) for more information on how this is set up.
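As a rough sketch of what talking to one of these endpoints looks like (the `/openscad/stl` route and the `file`/`settings` body shape are taken from the handler and emulator code further down in this diff; the exact response format varies by package, so treat this as illustrative only):

```js
// Hypothetical sketch only: POST a source file to a CAD lambda endpoint.
// CAD_LAMBDA_BASE_URL, the route and the { file, settings } body come from
// the handlers/emulator elsewhere in this repo; adjust to the real API.
const axios = require('axios')

async function renderStl(code) {
  const baseUrl = process.env.CAD_LAMBDA_BASE_URL // e.g. http://localhost:8080 when emulating locally
  const { data } = await axios.post(`${baseUrl}/openscad/stl`, {
    file: code,   // the OpenSCAD source to render
    settings: {}, // package-specific options (parameters, camera, ...)
  })
  return data
}
```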
## Getting your dev environment setup
Clone the repo and `cd` into the app directory (the docs directory is for [learn.cadhub](https://learn.cadhub.xyz/))
```
cd app
```
Install dependencies
```terminal
yarn install
```
To set up the db you'll need Postgres installed locally; you can [follow this guide](https://redwoodjs.com/docs/local-postgres-setup).
Run the following
``` terminal
yarn rw prisma migrate dev
yarn rw prisma db seed
```
p.s. `yarn rw prisma studio` spins up an app to inspect the db
### Fire up dev
```terminal
yarn rw dev
```
Your browser should open automatically to `http://localhost:8910` to see the web app. Lambda functions run on `http://localhost:8911` and are also proxied to `http://localhost:8910/.redwood/functions/*`.
If you want to access the website on your phone use `yarn redwood dev --fwd="--host <ip-address-on-your-network-i.e.-192.168.0.5>"`
You can sign in to the following accounts locally:
localUser1@kurthutten.com: `abc123`
localUser2@kurthutten.com: `abc123`
localAdmin@kurthutten.com: `abc123`
## Designs
In progress, though can be [seen on Figma](https://www.figma.com/file/VUh53RdncjZ7NuFYj0RGB9/CadHub?node-id=0%3A1)
## Docs
Docs are hosted at [learn.cadhub.xyz](http://learn.cadhub.xyz/). They include an OpenSCAD tutorial at this point, and more is coming. The docs can be found in this repo at [docs](https://github.com/Irev-Dev/cadhub/tree/main/docs)

View File

@@ -1,17 +1,78 @@
![Screen Recording 2021-09-21 at 8](https://user-images.githubusercontent.com/29681384/134154332-65491787-7b36-4ad9-ba7a-bac0f2874051.gif)
![scrch2](https://user-images.githubusercontent.com/29681384/134156021-6b55c301-a77a-4851-b67b-b656875123e5.jpg)
![CadHub banner](https://raw.githubusercontent.com/Irev-Dev/repo-images/main/images/gear%20donutbanner.png)
# [C a d H u b](https://cadhub.xyz)
<!-- [![Netlify Status](https://api.netlify.com/api/v1/badges/77f37543-e54a-4723-8136-157c0221ec27/deploy-status)](https://app.netlify.com/sites/cadhubxyz/deploys) -->
[![Netlify Status](https://api.netlify.com/api/v1/badges/77f37543-e54a-4723-8136-157c0221ec27/deploy-status)](https://app.netlify.com/sites/cadhubxyz/deploys)
Let's help Code-CAD reach its [full potential!](https://cadhub.xyz) We're making a ~~cad~~hub for the Code-CAD community; think of it as a model repository crossed with a live editor. We have integrations in progress for [OpenSCAD](https://cadhub.xyz/dev-ide/openscad) and [CadQuery](https://cadhub.xyz/dev-ide/cadquery) with [more coming soon](https://github.com/Irev-Dev/curated-code-cad).
Let's help Code-CAD reach its [full potential!](https://cadhub.xyz) We're making a ~~cad~~hub for the Code-CAD community; think of it as a model repository crossed with a live editor. We have an integration with the excellent [cascadeStudio](https://zalo.github.io/CascadeStudio/) with [more coming soon](https://github.com/Irev-Dev/curated-code-cad).
If you want to be involved in any way, check out the [contributing.md](https://github.com/Irev-Dev/cadhub/blob/main/CONTRIBUTING.md).
If you want to be involved in any way, check out the [Road Map](https://github.com/Irev-Dev/cadhub/discussions/212) and get in touch via [twitter](https://twitter.com/IrevDev), [discord](https://discord.gg/SD7zFRNjGH) or [discussions](https://github.com/Irev-Dev/cadhub/discussions).
You might also be interested in the [Road Map](https://github.com/Irev-Dev/cadhub/discussions/212) and getting in touch via [twitter](https://twitter.com/IrevDev), [discord](https://discord.gg/SD7zFRNjGH) or [discussions](https://github.com/Irev-Dev/cadhub/discussions).
<img src="https://raw.githubusercontent.com/Irev-Dev/repo-images/main/images/fullcadhubshot.jpg">
## Who is CadHub
<img src="https://raw.githubusercontent.com/Irev-Dev/repo-images/main/images/Part%20IDE%20-%20export%20expand%20state.jpg">
[Kurt](https://github.com/Irev-Dev) and [Frank](https://github.com/franknoirot) make up the core team, and [Jeremy](https://github.com/jmwright), [Torsten](https://github.com/t-paul) and [Hrg](https://github.com/hrgdavor) are major contributors, plus a number of smaller contributors.
## Getting Started
Because we're integrating cascadeStudio (somewhat crudely for the time being), you'll need to clone the repo with submodules.
```terminal
git clone --recurse-submodules -j8 git@github.com:Irev-Dev/cadhub.git
# or
git clone --recurse-submodules -j8 https://github.com/Irev-Dev/cadhub.git
```
Install dependencies
```terminal
yarn install
```
To set up the db you'll need Postgres installed locally; you can [follow this guide](https://redwoodjs.com/docs/local-postgres-setup) with a couple of exceptions:
- Run `yarn rw prisma migrate dev` instead of `yarn rw db up` in the guide.
- Don't worry about changing the `schema.prisma` file.
- You will need to add a `DATABASE_URL` and a test URL to your `.env` file at the root of the project (see the sketch below).
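A minimal sketch of what those entries might look like, assuming a local Postgres with hypothetical credentials and database names (`TEST_DATABASE_URL` is the usual Redwood name for the test url, but double-check against the guide above):

```
DATABASE_URL=postgres://postgres:postgres@localhost:5432/cadhub
TEST_DATABASE_URL=postgres://postgres:postgres@localhost:5432/cadhub_test
```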
Run the following
``` terminal
yarn rw prisma migrate dev
yarn rw prisma db seed
```
### Fire up dev
```terminal
yarn rw dev
```
Your browser should open automatically to `http://localhost:8910` to see the web app. Lambda functions run on `http://localhost:8911` and are also proxied to `http://localhost:8910/.redwood/functions/*`.
You can sign in to the following accounts locally:
localUser1@kurthutten.com: `abc123`
localUser2@kurthutten.com: `abc123`
localAdmin@kurthutten.com: `abc123`
You may need to register an account depending on what issue you are trying to tackle. This can be done by clicking the login button in the top right, which will open Netlify's identity modal asking for the website's URL, since it will notice you are developing locally. Enter `https://cadhub.xyz/`, then use your email, verify it and you should be set.
(Some routes are protected, but permissions are a big area that needs a lot of work in the near future, so they're in a very incomplete state atm.)
### Note:
We're using [RedwoodJS](https://redwoodjs.com/). This is perhaps unwise since they haven't reached 1.0 yet; however, with their aim to release 1.0 by the end of the year, it shouldn't be too difficult to port changes over the coming months.
If you're not familiar with Redwood, never fear: the main bits of tech it uses are React, GraphQL (Apollo) and serverless/lambdas. Depending on what part of the app you want to help with, so long as you know your way around these bits of tech you should be fine with some light referencing of the Redwood docs.
### Extra Redwood docs, i.e. getting familiar with the framework
- [Tutorial](https://redwoodjs.com/tutorial/welcome-to-redwood): getting started and complete overview guide.
- [Docs](https://redwoodjs.com/docs/introduction): using the Redwood Router, handling assets and files, list of command-line tools, and more.
- [Redwood Community](https://community.redwoodjs.com): get help, share tips and tricks, and collaborate on everything about RedwoodJS.
## Styles
We're using Tailwind utility classes, so please try to use them as much as possible. Again, if you're not familiar, the [Tailwind search](https://tailwindcss.com/) is fantastic: searching for the CSS property you want to use will lead you to the correct class 99% of the time.
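For instance, instead of writing custom CSS for a button you'd compose utility classes directly in the JSX. A hypothetical snippet (not from the codebase), just to show the flavour:

```jsx
// Hypothetical example of composing Tailwind utility classes in JSX
const SaveButton = ({ onClick }) => (
  <button
    onClick={onClick}
    className="px-4 py-2 rounded bg-indigo-600 text-white hover:bg-indigo-500"
  >
    Save
  </button>
)
```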
## Designs
In progress, though can be [seen on Figma](https://www.figma.com/file/VUh53RdncjZ7NuFYj0RGB9/CadHub?node-id=0%3A1)
<img src="https://raw.githubusercontent.com/Irev-Dev/repo-images/main/images/Part%20Page(1).jpg">
<img src="https://raw.githubusercontent.com/Irev-Dev/repo-images/main/images/User%20Page%20Edit.jpg">

View File

@@ -18,13 +18,8 @@ CLOUDINARY_API_KEY=476712943135152
# trace | info | debug | warn | error | silent
# LOG_LEVEL=debug
CAD_LAMBDA_BASE_URL="https://wzab9s632b.execute-api.us-east-1.amazonaws.com/prod"
# EMAIL_PASSWORD=abc123
# CAD_LAMBDA_BASE_URL="http://localhost:8080"
# sentry
GITHUB_ASSIST_APP_ID=23342
GITHUB_ASSIST_SECRET=abc
GITHUB_ASSIST_PRIVATE_KEY="-----BEGIN RSA PRIVATE KEY-----\nabcdefg\n-----END RSA PRIVATE KEY-----"
# Github assist app
# GITHUB_ASSIST_APP_ID=""
# GITHUB_ASSIST_SECRET=""

View File

@@ -1 +1 @@
/web/src/cascade/*

7
app/.gitignore vendored
View File

@@ -1,7 +1,2 @@
dist
web/types/graphql.d.ts
api/types/graphql.d.ts
# Deployment
.serverless
*.pem

View File

@@ -22,7 +22,9 @@
"Uploader",
"describedby",
"initialise",
"octokit",
"redwoodjs",
"repos",
"resizer",
"roboto",
"ropa"

1
app/api/.babelrc.js Normal file
View File

@@ -0,0 +1 @@
module.exports = { extends: "../babel.config.js" }

File diff suppressed because it is too large Load Diff

View File

@@ -1,9 +0,0 @@
-- CreateTable
CREATE TABLE "RW_DataMigration" (
"version" TEXT NOT NULL,
"name" TEXT NOT NULL,
"startedAt" TIMESTAMP(3) NOT NULL,
"finishedAt" TIMESTAMP(3) NOT NULL,
PRIMARY KEY ("version")
);

View File

@@ -1,31 +0,0 @@
/*
Warnings:
- You are about to drop the `Comment` table. If the table is not empty, all the data it contains will be lost.
- You are about to drop the `Part` table. If the table is not empty, all the data it contains will be lost.
- You are about to drop the `PartReaction` table. If the table is not empty, all the data it contains will be lost.
*/
-- DropForeignKey
ALTER TABLE "Comment" DROP CONSTRAINT "Comment_partId_fkey";
-- DropForeignKey
ALTER TABLE "Comment" DROP CONSTRAINT "Comment_userId_fkey";
-- DropForeignKey
ALTER TABLE "Part" DROP CONSTRAINT "Part_userId_fkey";
-- DropForeignKey
ALTER TABLE "PartReaction" DROP CONSTRAINT "PartReaction_partId_fkey";
-- DropForeignKey
ALTER TABLE "PartReaction" DROP CONSTRAINT "PartReaction_userId_fkey";
-- DropTable
DROP TABLE "Comment";
-- DropTable
DROP TABLE "Part";
-- DropTable
DROP TABLE "PartReaction";

View File

@@ -1,59 +0,0 @@
-- CreateTable
CREATE TABLE "Project" (
"id" TEXT NOT NULL,
"title" VARCHAR(25) NOT NULL,
"description" TEXT,
"code" TEXT,
"mainImage" TEXT,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
"userId" TEXT NOT NULL,
"deleted" BOOLEAN NOT NULL DEFAULT false,
PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "ProjectReaction" (
"id" TEXT NOT NULL,
"emote" TEXT NOT NULL,
"userId" TEXT NOT NULL,
"projectId" TEXT NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "Comment" (
"id" TEXT NOT NULL,
"text" TEXT NOT NULL,
"userId" TEXT NOT NULL,
"projectId" TEXT NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
PRIMARY KEY ("id")
);
-- CreateIndex
CREATE UNIQUE INDEX "Project.title_userId_unique" ON "Project"("title", "userId");
-- CreateIndex
CREATE UNIQUE INDEX "ProjectReaction.emote_userId_projectId_unique" ON "ProjectReaction"("emote", "userId", "projectId");
-- AddForeignKey
ALTER TABLE "Project" ADD FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE CASCADE ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "ProjectReaction" ADD FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE CASCADE ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "ProjectReaction" ADD FOREIGN KEY ("projectId") REFERENCES "Project"("id") ON DELETE CASCADE ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "Comment" ADD FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE CASCADE ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "Comment" ADD FOREIGN KEY ("projectId") REFERENCES "Project"("id") ON DELETE CASCADE ON UPDATE CASCADE;

View File

@@ -1,5 +0,0 @@
-- CreateEnum
CREATE TYPE "CadPackage" AS ENUM ('openscad', 'cadquery');
-- AlterTable
ALTER TABLE "Project" ADD COLUMN "cadPackage" "CadPackage" NOT NULL DEFAULT E'openscad';

View File

@@ -1,17 +0,0 @@
-- CreateTable
CREATE TABLE "SocialCard" (
"id" TEXT NOT NULL,
"projectId" TEXT NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
"url" TEXT,
"outOfDate" BOOLEAN NOT NULL DEFAULT true,
PRIMARY KEY ("id")
);
-- CreateIndex
CREATE UNIQUE INDEX "SocialCard_projectId_unique" ON "SocialCard"("projectId");
-- AddForeignKey
ALTER TABLE "SocialCard" ADD FOREIGN KEY ("projectId") REFERENCES "Project"("id") ON DELETE CASCADE ON UPDATE CASCADE;

View File

@@ -1,2 +0,0 @@
-- AlterIndex
ALTER INDEX "SocialCard_projectId_unique" RENAME TO "SocialCard.projectId_unique";

View File

@@ -1,2 +0,0 @@
-- AlterEnum
ALTER TYPE "CadPackage" ADD VALUE 'jscad';

View File

@@ -1,5 +0,0 @@
-- AlterTable
ALTER TABLE "Project" ADD COLUMN "forkedFromId" TEXT;
-- AddForeignKey
ALTER TABLE "Project" ADD FOREIGN KEY ("forkedFromId") REFERENCES "Project"("id") ON DELETE SET NULL ON UPDATE CASCADE;

View File

@@ -1,56 +0,0 @@
-- DropForeignKey
ALTER TABLE "Comment" DROP CONSTRAINT "Comment_projectId_fkey";
-- DropForeignKey
ALTER TABLE "Comment" DROP CONSTRAINT "Comment_userId_fkey";
-- DropForeignKey
ALTER TABLE "Project" DROP CONSTRAINT "Project_userId_fkey";
-- DropForeignKey
ALTER TABLE "ProjectReaction" DROP CONSTRAINT "ProjectReaction_projectId_fkey";
-- DropForeignKey
ALTER TABLE "ProjectReaction" DROP CONSTRAINT "ProjectReaction_userId_fkey";
-- DropForeignKey
ALTER TABLE "SocialCard" DROP CONSTRAINT "SocialCard_projectId_fkey";
-- DropForeignKey
ALTER TABLE "SubjectAccessRequest" DROP CONSTRAINT "SubjectAccessRequest_userId_fkey";
-- AddForeignKey
ALTER TABLE "Project" ADD CONSTRAINT "Project_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "SocialCard" ADD CONSTRAINT "SocialCard_projectId_fkey" FOREIGN KEY ("projectId") REFERENCES "Project"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "ProjectReaction" ADD CONSTRAINT "ProjectReaction_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "ProjectReaction" ADD CONSTRAINT "ProjectReaction_projectId_fkey" FOREIGN KEY ("projectId") REFERENCES "Project"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "Comment" ADD CONSTRAINT "Comment_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "Comment" ADD CONSTRAINT "Comment_projectId_fkey" FOREIGN KEY ("projectId") REFERENCES "Project"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "SubjectAccessRequest" ADD CONSTRAINT "SubjectAccessRequest_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
-- RenameIndex
ALTER INDEX "Project.title_userId_unique" RENAME TO "Project_title_userId_key";
-- RenameIndex
ALTER INDEX "ProjectReaction.emote_userId_projectId_unique" RENAME TO "ProjectReaction_emote_userId_projectId_key";
-- RenameIndex
ALTER INDEX "SocialCard.projectId_unique" RENAME TO "SocialCard_projectId_key";
-- RenameIndex
ALTER INDEX "User.email_unique" RENAME TO "User_email_key";
-- RenameIndex
ALTER INDEX "User.userName_unique" RENAME TO "User_userName_key";

View File

@@ -1,3 +1,2 @@
# Please do not edit this file manually
# It should be added in your version-control system (i.e. Git)
provider = "postgresql"

View File

@@ -1,11 +1,11 @@
datasource db {
datasource DS {
provider = "postgresql"
url = env("DATABASE_URL")
}
generator client {
provider = "prisma-client-js"
binaryTargets = ["native", "rhel-openssl-1.0.x"]
binaryTargets = "native"
}
// SQLite does not support enums so we can't use enums until we set up postgresql in dev mode
@@ -14,6 +14,11 @@ generator client {
// ADMIN
// }
// enum PartType {
// CASCADESTUDIO
// JSCAD
// }
model User {
id String @id @default(uuid())
userName String @unique // referred to as userId in @relations
@@ -28,21 +33,15 @@ model User {
image String? // url maybe id or file storage service? cloudinary?
bio String? //mark down
Project Project[]
Reaction ProjectReaction[]
Part Part[]
Reaction PartReaction[]
Comment Comment[]
SubjectAccessRequest SubjectAccessRequest[]
}
enum CadPackage {
openscad
cadquery
jscad // TODO #422, add jscad to db schema when we're ready to enable saving of jscad projects
}
model Project {
model Part {
id String @id @default(uuid())
title String @db.VarChar(25)
title String
description String? // markdown string
code String?
mainImage String? // link to cloudinary
@@ -51,39 +50,23 @@ model Project {
user User @relation(fields: [userId], references: [id])
userId String
deleted Boolean @default(false)
cadPackage CadPackage @default(openscad)
socialCard SocialCard?
forkedFromId String?
forkedFrom Project? @relation("Fork", fields: [forkedFromId], references: [id])
childForks Project[] @relation("Fork")
Comment Comment[]
Reaction ProjectReaction[]
Comment Comment[]
Reaction PartReaction[]
@@unique([title, userId])
}
model SocialCard {
id String @id @default(uuid())
projectId String @unique
project Project @relation(fields: [projectId], references: [id])
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
url String? // link to cloudinary
outOfDate Boolean @default(true)
}
model ProjectReaction {
model PartReaction {
id String @id @default(uuid())
emote String // an emoji
user User @relation(fields: [userId], references: [id])
userId String
project Project @relation(fields: [projectId], references: [id])
projectId String
part Part @relation(fields: [partId], references: [id])
partId String
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@unique([emote, userId, projectId])
@@unique([emote, userId, partId])
}
model Comment {
@@ -91,8 +74,8 @@ model Comment {
text String // the comment, should I allow mark down?
user User @relation(fields: [userId], references: [id])
userId String
project Project @relation(fields: [projectId], references: [id])
projectId String
part Part @relation(fields: [partId], references: [id])
partId String
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
@@ -108,10 +91,3 @@ model SubjectAccessRequest {
createdAt DateTime @default(now())
updatedAt DateTime @updatedAt
}
model RW_DataMigration {
version String @id
name String
startedAt DateTime
finishedAt DateTime
}

118
app/api/db/seed.js Normal file
View File

@@ -0,0 +1,118 @@
/* eslint-disable no-console */
const { PrismaClient } = require('@prisma/client')
const dotenv = require('dotenv')
dotenv.config()
const db = new PrismaClient()
async function main() {
// Seed data is database data that needs to exist for your app to run.
// Ideally this file should be idempotent: running it multiple times
// will result in the same database state (usually by checking for the
// existence of a record before trying to create it). For example:
//
// const existing = await db.user.findMany({ where: { email: 'admin@email.com' }})
// if (!existing.length) {
// await db.user.create({ data: { name: 'Admin', email: 'admin@email.com' }})
// }
const users = [
{
id: "a2b21ce1-ae57-43a2-b6a3-b6e542fd9e60",
userName: "local-user-1",
name: "local 1",
email: "localUser1@kurthutten.com"
},
{
id: "682ba807-d10e-4caf-bf28-74054e46c9ec",
userName: "local-user-2",
name: "local 2",
email: "localUser2@kurthutten.com"
},
{
id: "5cea3906-1e8e-4673-8f0d-89e6a963c096",
userName: "local-admin-2",
name: "local admin",
email: "localAdmin@kurthutten.com"
},
]
let existing
existing = await db.user.findMany({ where: { id: users[0].id }})
if(!existing.length) {
await db.user.create({
data: users[0],
})
}
existing = await db.user.findMany({ where: { id: users[1].id }})
if(!existing.length) {
await db.user.create({
data: users[1],
})
}
const parts = [
{
title: 'demo-part1',
description: '# can be markdown',
mainImage: 'CadHub/kjdlgjnu0xmwksia7xox',
user: {
connect: {
id: users[0].id,
},
},
},
{
title: 'demo-part2',
description: '## [hey](www.google.com)',
user: {
connect: {
id: users[1].id,
},
},
},
]
existing = await db.part.findMany({where: { title: parts[0].title}})
if(!existing.length) {
await db.part.create({
data: parts[0],
})
}
existing = await db.part.findMany({where: { title: parts[1].title}})
if(!existing.length) {
await db.part.create({
data: parts[1],
})
}
const aPart = await db.part.findUnique({where: {
title_userId: {
title: parts[0].title,
userId: users[0].id,
}
}})
await db.comment.create({
data: {
text: "nice part, I like it",
user: {connect: { id: users[0].id}},
part: {connect: { id: aPart.id}},
}
})
await db.partReaction.create({
data: {
emote: "❤️",
user: {connect: { id: users[0].id}},
part: {connect: { id: aPart.id}},
}
})
console.info('No data to seed. See api/prisma/seeds.js for info.')
}
main()
.catch((e) => console.error(e))
.finally(async () => {
await db.$disconnect()
})

View File

@@ -1 +1,6 @@
module.exports = require('@redwoodjs/testing/config/jest/api')
const { getConfig } = require('@redwoodjs/core')
const config = getConfig({ type: 'jest', target: 'node' })
config.displayName.name = 'api'
module.exports = config

View File

@@ -3,25 +3,10 @@
"version": "0.0.0",
"private": true,
"dependencies": {
"@redwoodjs/api": "^0.38.1",
"@redwoodjs/graphql-server": "^0.38.1",
"@sentry/node": "^6.5.1",
"axios": "^0.21.1",
"cloudinary": "^1.23.0",
"cors": "^2.8.5",
"express": "^4.17.1",
"human-id": "^2.0.1",
"middy": "^0.36.0",
"nanoid": "^3.1.20",
"nodemailer": "^6.6.2",
"serverless-binary-cors": "^0.0.1"
},
"devDependencies": {
"@netlify/zip-it-and-ship-it": "^4.30.0",
"@types/nodemailer": "^6.4.2",
"concurrently": "^6.0.0",
"nodemon": "^2.0.7",
"serverless-dotenv-plugin": "^3.10.0",
"serverless-plugin-git-variables": "^5.1.0"
"@octokit/app": "^12.0.2",
"@octokit/webhooks-types": "^3.71.1",
"@redwoodjs/api": "^0.31.0",
"@redwoodjs/api-server": "^0.31.0",
"cloudinary": "^1.23.0"
}
}

View File

@@ -1,18 +0,0 @@
import { mockRedwoodDirective, getDirectiveName } from '@redwoodjs/testing/api'
import requireAuth from './requireAuth'
describe('requireAuth directive', () => {
it('declares the directive sdl as schema, with the correct name', () => {
expect(requireAuth.schema).toBeTruthy()
expect(getDirectiveName(requireAuth.schema)).toBe('requireAuth')
})
it('requireAuth has stub implementation. Should not throw when current user', () => {
// If you want to set values in context, pass it through e.g.
// mockRedwoodDirective(requireAuth, { context: { currentUser: { id: 1, name: 'Lebron McGretzky' } }})
const mockExecution = mockRedwoodDirective(requireAuth, { context: {} })
expect(mockExecution).not.toThrowError()
})
})

View File

@@ -1,22 +0,0 @@
import gql from 'graphql-tag'
import { createValidatorDirective } from '@redwoodjs/graphql-server'
import { requireAuth as applicationRequireAuth } from 'src/lib/auth'
export const schema = gql`
"""
Use to check whether or not a user is authenticated and is associated
with an optional set of roles.
"""
directive @requireAuth(roles: [String]) on FIELD_DEFINITION
`
const validate = ({ directiveArgs }) => {
const { roles } = directiveArgs
applicationRequireAuth({ roles })
}
const requireAuth = createValidatorDirective(schema, validate)
export default requireAuth

View File

@@ -1,10 +0,0 @@
import { getDirectiveName } from '@redwoodjs/testing/api'
import skipAuth from './skipAuth'
describe('skipAuth directive', () => {
it('declares the directive sdl as schema, with the correct name', () => {
expect(skipAuth.schema).toBeTruthy()
expect(getDirectiveName(skipAuth.schema)).toBe('skipAuth')
})
})

View File

@@ -1,16 +0,0 @@
import gql from 'graphql-tag'
import { createValidatorDirective } from '@redwoodjs/graphql-server'
export const schema = gql`
"""
Use to skip authentication checks and allow public access.
"""
directive @skipAuth on FIELD_DEFINITION
`
const skipAuth = createValidatorDirective(schema, () => {
return
})
export default skipAuth

View File

@@ -1,5 +0,0 @@
# The following are the env vars you need to run the cad lambdas locally
# They still connect to s3 so some secrets are needed; ask Kurt and he'll set things up for you
DEV_AWS_SECRET_ACCESS_KEY=""
DEV_AWS_ACCESS_KEY_ID=""
DEV_BUCKET="cad-preview-bucket-dev-001"

View File

@@ -1,12 +1,11 @@
# Serverless
We're using the serverless framework for deployment
We're using the serverless from work for deployment
```
yarn rw build api && sls deploy --stage <stagename>
sls deploy
```
But [Kurt Hutten](https://github.com/Irev-Dev) is the only one with credentials for deployment atm, though if you wanted to test you could set up your own account and deploy to that.
Deploying runs `yarn rw build` first because the image uses the built js files.
## Testing changes locally
@@ -15,21 +14,20 @@ You'll need to have Docker installed
Because the docker containers to be deployed as lambdas on aws are somewhat specialised for that purpose, we're using `docker-compose` to spin one up for each function/endpoint, and we've added an aws-emulation layer.
The docker build relies on a git-ignored file, the aws-lambda-rie. [Download it](https://github.com/aws/aws-lambda-runtime-interface-emulator/releases/download/v1.0/aws-lambda-rie), then put it into `app/api/src/docker/common/`. Alternatively you can put this download into the Dockerfiles by reading the instructions at around line 29 of the Dockerfiles (`app/api/src/docker/openscad/Dockerfile` & `app/api/src/docker/cadquery/Dockerfile`). However this will mean slower build times as it will need to download this 14mb file every build.
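For example (using the download URL above and the git-ignored path it mentions, run from the repo root; this is just one way to fetch it):

```bash
wget https://github.com/aws/aws-lambda-runtime-interface-emulator/releases/download/v1.0/aws-lambda-rie \
  -O app/api/src/docker/common/aws-lambda-rie
```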
Run
Then cd into this folder `cd api/src/docker` and:
```bash
yarn cad
docker-compose up --build
```
The first time you run this it has to build the main image, which will take some time, but launching again will be quicker.
After that we'll also spin up a light express server to act as an emulator, transforming requests from the front end into the shape the lambdas expect (this emulates the aws-api-gateway, which transforms the inbound requests somewhat).
```
yarn aws-emulate
yarn install
yarn emulate
```
You can now add `CAD_LAMBDA_BASE_URL="http://localhost:8080"` to your .env file and restart your main dev process (`yarn rw dev`); comment that line out if you want to go back to using the aws endpoint (and restart the dev process).
You can now add OPENSCAD_BASE_URL="http://localhost:8080" to you .env file and restart your main dev process (`yarn rw dev`)
comment that line out if you want to go back to using the aws endpoint (and restart the dev process).
If you change anything in the `api/src/docker/openscad` directory, you will need to stop the docker process and restart it (it will be fairly quick if you're only changing the js).
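Put another way, the toggle in your `.env` might look something like this (the hosted URL shown is the one from the example env file earlier in this diff):

```
# Local CAD lambda emulator
CAD_LAMBDA_BASE_URL="http://localhost:8080"
# Hosted lambdas (swap which line is commented out and restart `yarn rw dev`)
# CAD_LAMBDA_BASE_URL="https://wzab9s632b.execute-api.us-east-1.amazonaws.com/prod"
```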

View File

@@ -1,6 +1,7 @@
const express = require('express')
var cors = require('cors')
const axios = require('axios')
const { restart } = require('nodemon')
const app = express()
const port = 8080
app.use(express.json())
@@ -9,32 +10,31 @@ app.use(cors())
const invocationURL = (port) =>
`http://localhost:${port}/2015-03-31/functions/function/invocations`
const makeRequest = (route, port) => [
route,
async (req, res) => {
console.log(`making post request to ${port}, ${route}`)
try {
const { data } = await axios.post(invocationURL(port), {
body: Buffer.from(JSON.stringify(req.body)).toString('base64'),
})
res.status(data.statusCode)
res.setHeader('Content-Type', 'application/javascript')
if (data.headers && data.headers['Content-Encoding'] === 'gzip') {
res.setHeader('Content-Encoding', 'gzip')
res.send(Buffer.from(data.body, 'base64'))
} else {
res.send(Buffer.from(data.body, 'base64'))
}
} catch (e) {
res.status(500)
res.send()
}
},
]
app.post(...makeRequest('/openscad/preview', 5052))
app.post(...makeRequest('/openscad/stl', 5053))
app.post(...makeRequest('/cadquery/stl', 5060))
app.post('/openscad/preview', async (req, res) => {
try {
const { data } = await axios.post(invocationURL(5052), {
body: Buffer.from(JSON.stringify(req.body)).toString('base64'),
})
res.status(data.statusCode)
res.send(data.body)
} catch (e) {
res.status(500)
res.send()
}
})
app.post('/cadquery/stl', async (req, res) => {
console.log('making post request to 5060')
try {
const { data } = await axios.post(invocationURL(5060), {
body: Buffer.from(JSON.stringify(req.body)).toString('base64'),
})
res.status(data.statusCode)
res.send(data.body)
} catch (e) {
res.status(500)
res.send()
}
})
app.listen(port, () => {
console.log(`Example app listening at http://localhost:${port}`)

View File

@@ -1,15 +1,14 @@
FROM continuumio/miniconda3
ENV PATH="/root/miniconda3/bin:${PATH}"
ARG PATH="/root/miniconda3/bin:${PATH}"
FROM public.ecr.aws/lts/ubuntu:20.04_stable
ARG DEBIAN_FRONTEND=noninteractive
RUN apt-get update --fix-missing -qq
RUN apt-get update -qq
RUN apt-get -y -qq install software-properties-common dirmngr apt-transport-https lsb-release ca-certificates xvfb
RUN apt-get update -qq
RUN apt-get install -y wget
# install node14, see comment at the top of node14source_setup.sh
ADD api/src/docker/common/node14source_setup.sh /nodesource_setup.sh
ADD common/node14source_setup.sh /nodesource_setup.sh
RUN ["chmod", "+x", "/nodesource_setup.sh"]
RUN bash nodesource_setup.sh
RUN apt-get install -y nodejs
@@ -22,45 +21,29 @@ RUN apt-get update && \
cmake \
unzip \
automake autoconf libtool \
libcurl4-openssl-dev \
curl \
git
libcurl4-openssl-dev
# Add the lambda emulator for local dev, (see entrypoint.sh for where it's used),
# I have the file locally (gitignored) to speed up build times (as it downloads every time),
# but you can use the http version of the below ADD command or download it yourself from that url.
ADD api/src/docker/common/aws-lambda-rie /usr/local/bin/aws-lambda-rie
ADD common/aws-lambda-rie /usr/local/bin/aws-lambda-rie
# ADD https://github.com/aws/aws-lambda-runtime-interface-emulator/releases/download/v1.0/aws-lambda-rie /usr/local/bin/aws-lambda-rie
RUN ["chmod", "+x", "/usr/local/bin/aws-lambda-rie"]
WORKDIR /var/task/
# aws-lambda-ric does not play nice with yarn, so installing it separately,
# circle back to this later for a proper solution
COPY package*.json /var/task/
RUN npm install aws-lambda-ric@1.0.0
COPY cadquery/package*.json /var/task/
RUN npm install
RUN conda --version
# Install CadQuery
RUN conda install -c cadquery -c conda-forge cadquery=master ocp=7.5.2 python=3.8
RUN conda info
# Get a copy of cq-cli from GitHub
RUN git clone https://github.com/CadQuery/cq-cli.git
# Get the distribution copy of cq-cli
RUN apt-get install -y libglew2.1
RUN wget https://github.com/CadQuery/cq-cli/releases/download/v2.1.0/cq-cli-Linux-x86_64.zip
RUN unzip cq-cli-Linux-x86_64.zip
RUN echo "cadhub-concat-split" > /var/task/cadhub-concat-split
# using built javascript from dist
# run `yarn rw build` and $(npm bin)/zip-it-and-ship-it api/dist/functions/ api/dist/zipball before building this image
COPY api/dist/zipball/cadquery.zip /var/task/
# -n stops aws-lambda-ric from being overridden.
RUN unzip -n /var/task/cadquery.zip
COPY api/src/docker/common/entrypoint.sh /entrypoint.sh
RUN chmod +x cq-cli/cq-cli
COPY cadquery/*.js /var/task/
COPY common/*.js /var/common/
COPY common/entrypoint.sh /entrypoint.sh
RUN ["chmod", "+x", "/entrypoint.sh"]
ENTRYPOINT ["sh", "/entrypoint.sh"]
CMD [ "cadquery.stl" ]

View File

@@ -0,0 +1,53 @@
const { runCQ } = require('./runCQ')
const middy = require('middy')
const { cors } = require('middy/middlewares')
// cors true does not seem to work in serverless.yml, perhaps docker lambdas aren't covered by that config
// special lambda just for responding to options requests
const preflightOptions = (req, _context, callback) => {
const response = {
statusCode: 204,
headers: {
'Access-Control-Allow-Origin': '*',
'Access-Control-Allow-Methods': 'POST',
'Access-Control-Allow-Headers': '*',
},
}
callback(null, response)
}
const stl = async (req, _context, callback) => {
_context.callbackWaitsForEmptyEventLoop = false
const eventBody = Buffer.from(req.body, 'base64').toString('ascii')
console.log(eventBody, 'eventBody')
const { file, settings } = JSON.parse(eventBody)
const { error, result, tempFile } = await runCQ({ file, settings })
if (error) {
const response = {
statusCode: 400,
body: JSON.stringify({ error, tempFile }),
}
callback(null, response)
} else {
console.log(`got result in route: ${result}, file is: ${tempFile}`)
const fs = require('fs')
const image = fs.readFileSync(`/tmp/${tempFile}/output.stl`, {
encoding: 'base64',
})
console.log(image, 'encoded image')
const response = {
statusCode: 200,
body: JSON.stringify({
imageBase64: image,
result,
tempFile,
}),
}
callback(null, response)
}
}
module.exports = {
stl: middy(stl).use(cors()),
preflightOptions,
}

View File

@@ -1,21 +0,0 @@
import { runCQ } from './runCQ'
import middy from 'middy'
import { cors } from 'middy/middlewares'
import { loggerWrap, storeAssetAndReturnUrl } from '../common/utils'
const _stl = async (req, _context, callback) => {
_context.callbackWaitsForEmptyEventLoop = false
const eventBody = Buffer.from(req.body, 'base64').toString('ascii')
console.log('eventBody', eventBody)
const { file, settings } = JSON.parse(eventBody)
const { error, consoleMessage, fullPath } = await runCQ({ file, settings })
await storeAssetAndReturnUrl({
error,
callback,
fullPath,
consoleMessage,
})
}
export const stl = middy(loggerWrap(_stl)).use(cors())

View File

@@ -0,0 +1,16 @@
{
"name": "openscad-endpoint",
"version": "0.0.1",
"description": "endpoint for openscad",
"main": "index.js",
"author": "Kurt Hutten <kurt@kurthutten.com>",
"license": "",
"dependencies": {
"cors": "^2.8.5",
"middy": "^0.36.0",
"nanoid": "^3.1.20"
},
"devDependencies": {
"aws-lambda-ric": "^1.0.0"
}
}

View File

@@ -0,0 +1,15 @@
const { makeFile, runCommand } = require('../common/utils')
const { nanoid } = require('nanoid')
module.exports.runCQ = async ({ file, settings = {} } = {}) => {
const tempFile = await makeFile(file, '.py', nanoid)
const command = `cq-cli/cq-cli --codec stl --infile /tmp/${tempFile}/main.py --outfile /tmp/${tempFile}/output.stl`
console.log('command', command)
try {
const result = await runCommand(command, 30000)
return { result, tempFile }
} catch (error) {
return { error, tempFile }
}
}

View File

@@ -1,60 +0,0 @@
import { writeFiles, runCommand } from '../common/utils'
import { nanoid } from 'nanoid'
import { readFile } from 'fs/promises'
export const runCQ = async ({
file,
settings: { deflection = 0.3, parameters } = {},
} = {}) => {
const tempFile = await writeFiles(
[
{ file, fileName: 'main.py' },
{
file: JSON.stringify(parameters),
fileName: 'params.json',
},
],
'a' + nanoid() // 'a' ensures nothing funny happens if it starts with a bad character like "-", maybe I should pick a safer id generator :shrug:
)
const fullPath = `/tmp/${tempFile}/output.gz`
const stlPath = `/tmp/${tempFile}/output.stl`
const customizerPath = `/tmp/${tempFile}/customizer.json`
const command = [
`/var/task/cq-cli/cq-cli.py`,
`--codec stl`,
`--infile /tmp/${tempFile}/main.py`,
`--outfile ${stlPath}`,
`--outputopts "deflection:${deflection};angularDeflection:${deflection};"`,
`--params /tmp/${tempFile}/params.json`,
`--getparams ${customizerPath}`,
].join(' ')
console.log('command', command)
let consoleMessage = ''
try {
consoleMessage = await runCommand(command, 30000)
const params = JSON.parse(
await readFile(customizerPath, { encoding: 'ascii' })
)
await writeFiles(
[
{
file: JSON.stringify({
customizerParams: params,
consoleMessage,
type: 'stl',
}),
fileName: 'metadata.json',
},
],
tempFile
)
await runCommand(
`cat ${stlPath} /var/task/cadhub-concat-split /tmp/${tempFile}/metadata.json | gzip > ${fullPath}`,
15000,
true
)
return { consoleMessage, fullPath }
} catch (error) {
return { error: consoleMessage || error, fullPath }
}
}

View File

@@ -0,0 +1,386 @@
# THIS IS AN AUTOGENERATED FILE. DO NOT EDIT THIS FILE DIRECTLY.
# yarn lockfile v1
accepts@~1.3.7:
version "1.3.7"
resolved "https://registry.yarnpkg.com/accepts/-/accepts-1.3.7.tgz#531bc726517a3b2b41f850021c6cc15eaab507cd"
integrity sha512-Il80Qs2WjYlJIBNzNkK6KYqlVMTbZLXgHx2oT0pU/fjRHyEp+PEfEPY0R3WCwAGVOtauxh1hOxNgIf5bv7dQpA==
dependencies:
mime-types "~2.1.24"
negotiator "0.6.2"
array-flatten@1.1.1:
version "1.1.1"
resolved "https://registry.yarnpkg.com/array-flatten/-/array-flatten-1.1.1.tgz#9a5f699051b1e7073328f2a008968b64ea2955d2"
integrity sha1-ml9pkFGx5wczKPKgCJaLZOopVdI=
body-parser@1.19.0:
version "1.19.0"
resolved "https://registry.yarnpkg.com/body-parser/-/body-parser-1.19.0.tgz#96b2709e57c9c4e09a6fd66a8fd979844f69f08a"
integrity sha512-dhEPs72UPbDnAQJ9ZKMNTP6ptJaionhP5cBb541nXPlW60Jepo9RV/a4fX4XWW9CuFNK22krhrj1+rgzifNCsw==
dependencies:
bytes "3.1.0"
content-type "~1.0.4"
debug "2.6.9"
depd "~1.1.2"
http-errors "1.7.2"
iconv-lite "0.4.24"
on-finished "~2.3.0"
qs "6.7.0"
raw-body "2.4.0"
type-is "~1.6.17"
bytes@3.1.0:
version "3.1.0"
resolved "https://registry.yarnpkg.com/bytes/-/bytes-3.1.0.tgz#f6cf7933a360e0588fa9fde85651cdc7f805d1f6"
integrity sha512-zauLjrfCG+xvoyaqLoV8bLVXXNGC4JqlxFCutSDWA6fJrTo2ZuvLYTqZ7aHBLZSMOopbzwv8f+wZcVzfVTI2Dg==
content-disposition@0.5.3:
version "0.5.3"
resolved "https://registry.yarnpkg.com/content-disposition/-/content-disposition-0.5.3.tgz#e130caf7e7279087c5616c2007d0485698984fbd"
integrity sha512-ExO0774ikEObIAEV9kDo50o+79VCUdEB6n6lzKgGwupcVeRlhrj3qGAfwq8G6uBJjkqLrhT0qEYFcWng8z1z0g==
dependencies:
safe-buffer "5.1.2"
content-type@~1.0.4:
version "1.0.4"
resolved "https://registry.yarnpkg.com/content-type/-/content-type-1.0.4.tgz#e138cc75e040c727b1966fe5e5f8c9aee256fe3b"
integrity sha512-hIP3EEPs8tB9AT1L+NUqtwOAps4mk2Zob89MWXMHjHWg9milF/j4osnnQLXBCBFBk/tvIG/tUc9mOUJiPBhPXA==
cookie-signature@1.0.6:
version "1.0.6"
resolved "https://registry.yarnpkg.com/cookie-signature/-/cookie-signature-1.0.6.tgz#e303a882b342cc3ee8ca513a79999734dab3ae2c"
integrity sha1-4wOogrNCzD7oylE6eZmXNNqzriw=
cookie@0.4.0:
version "0.4.0"
resolved "https://registry.yarnpkg.com/cookie/-/cookie-0.4.0.tgz#beb437e7022b3b6d49019d088665303ebe9c14ba"
integrity sha512-+Hp8fLp57wnUSt0tY0tHEXh4voZRDnoIrZPqlo3DPiI4y9lwg/jqx+1Om94/W6ZaPDOUbnjOt/99w66zk+l1Xg==
cors@^2.8.5:
version "2.8.5"
resolved "https://registry.yarnpkg.com/cors/-/cors-2.8.5.tgz#eac11da51592dd86b9f06f6e7ac293b3df875d29"
integrity sha512-KIHbLJqu73RGr/hnbrO9uBeixNGuvSQjul/jdFvS/KFSIH1hWVd1ng7zOHx+YrEfInLG7q4n6GHQ9cDtxv/P6g==
dependencies:
object-assign "^4"
vary "^1"
debug@2.6.9:
version "2.6.9"
resolved "https://registry.yarnpkg.com/debug/-/debug-2.6.9.tgz#5d128515df134ff327e90a4c93f4e077a536341f"
integrity sha512-bC7ElrdJaJnPbAP+1EotYvqZsb3ecl5wi6Bfi6BJTUcNowp6cvspg0jXznRTKDjm/E7AdgFBVeAPVMNcKGsHMA==
dependencies:
ms "2.0.0"
depd@~1.1.2:
version "1.1.2"
resolved "https://registry.yarnpkg.com/depd/-/depd-1.1.2.tgz#9bcd52e14c097763e749b274c4346ed2e560b5a9"
integrity sha1-m81S4UwJd2PnSbJ0xDRu0uVgtak=
destroy@~1.0.4:
version "1.0.4"
resolved "https://registry.yarnpkg.com/destroy/-/destroy-1.0.4.tgz#978857442c44749e4206613e37946205826abd80"
integrity sha1-l4hXRCxEdJ5CBmE+N5RiBYJqvYA=
ee-first@1.1.1:
version "1.1.1"
resolved "https://registry.yarnpkg.com/ee-first/-/ee-first-1.1.1.tgz#590c61156b0ae2f4f0255732a158b266bc56b21d"
integrity sha1-WQxhFWsK4vTwJVcyoViyZrxWsh0=
encodeurl@~1.0.2:
version "1.0.2"
resolved "https://registry.yarnpkg.com/encodeurl/-/encodeurl-1.0.2.tgz#ad3ff4c86ec2d029322f5a02c3a9a606c95b3f59"
integrity sha1-rT/0yG7C0CkyL1oCw6mmBslbP1k=
escape-html@~1.0.3:
version "1.0.3"
resolved "https://registry.yarnpkg.com/escape-html/-/escape-html-1.0.3.tgz#0258eae4d3d0c0974de1c169188ef0051d1d1988"
integrity sha1-Aljq5NPQwJdN4cFpGI7wBR0dGYg=
etag@~1.8.1:
version "1.8.1"
resolved "https://registry.yarnpkg.com/etag/-/etag-1.8.1.tgz#41ae2eeb65efa62268aebfea83ac7d79299b0887"
integrity sha1-Qa4u62XvpiJorr/qg6x9eSmbCIc=
express@^4.17.1:
version "4.17.1"
resolved "https://registry.yarnpkg.com/express/-/express-4.17.1.tgz#4491fc38605cf51f8629d39c2b5d026f98a4c134"
integrity sha512-mHJ9O79RqluphRrcw2X/GTh3k9tVv8YcoyY4Kkh4WDMUYKRZUq0h1o0w2rrrxBqM7VoeUVqgb27xlEMXTnYt4g==
dependencies:
accepts "~1.3.7"
array-flatten "1.1.1"
body-parser "1.19.0"
content-disposition "0.5.3"
content-type "~1.0.4"
cookie "0.4.0"
cookie-signature "1.0.6"
debug "2.6.9"
depd "~1.1.2"
encodeurl "~1.0.2"
escape-html "~1.0.3"
etag "~1.8.1"
finalhandler "~1.1.2"
fresh "0.5.2"
merge-descriptors "1.0.1"
methods "~1.1.2"
on-finished "~2.3.0"
parseurl "~1.3.3"
path-to-regexp "0.1.7"
proxy-addr "~2.0.5"
qs "6.7.0"
range-parser "~1.2.1"
safe-buffer "5.1.2"
send "0.17.1"
serve-static "1.14.1"
setprototypeof "1.1.1"
statuses "~1.5.0"
type-is "~1.6.18"
utils-merge "1.0.1"
vary "~1.1.2"
finalhandler@~1.1.2:
version "1.1.2"
resolved "https://registry.yarnpkg.com/finalhandler/-/finalhandler-1.1.2.tgz#b7e7d000ffd11938d0fdb053506f6ebabe9f587d"
integrity sha512-aAWcW57uxVNrQZqFXjITpW3sIUQmHGG3qSb9mUah9MgMC4NeWhNOlNjXEYq3HjRAvL6arUviZGGJsBg6z0zsWA==
dependencies:
debug "2.6.9"
encodeurl "~1.0.2"
escape-html "~1.0.3"
on-finished "~2.3.0"
parseurl "~1.3.3"
statuses "~1.5.0"
unpipe "~1.0.0"
forwarded@~0.1.2:
version "0.1.2"
resolved "https://registry.yarnpkg.com/forwarded/-/forwarded-0.1.2.tgz#98c23dab1175657b8c0573e8ceccd91b0ff18c84"
integrity sha1-mMI9qxF1ZXuMBXPozszZGw/xjIQ=
fresh@0.5.2:
version "0.5.2"
resolved "https://registry.yarnpkg.com/fresh/-/fresh-0.5.2.tgz#3d8cadd90d976569fa835ab1f8e4b23a105605a7"
integrity sha1-PYyt2Q2XZWn6g1qx+OSyOhBWBac=
http-errors@1.7.2:
version "1.7.2"
resolved "https://registry.yarnpkg.com/http-errors/-/http-errors-1.7.2.tgz#4f5029cf13239f31036e5b2e55292bcfbcc85c8f"
integrity sha512-uUQBt3H/cSIVfch6i1EuPNy/YsRSOUBXTVfZ+yR7Zjez3qjBz6i9+i4zjNaoqcoFVI4lQJ5plg63TvGfRSDCRg==
dependencies:
depd "~1.1.2"
inherits "2.0.3"
setprototypeof "1.1.1"
statuses ">= 1.5.0 < 2"
toidentifier "1.0.0"
http-errors@~1.7.2:
version "1.7.3"
resolved "https://registry.yarnpkg.com/http-errors/-/http-errors-1.7.3.tgz#6c619e4f9c60308c38519498c14fbb10aacebb06"
integrity sha512-ZTTX0MWrsQ2ZAhA1cejAwDLycFsd7I7nVtnkT3Ol0aqodaKW+0CTZDQ1uBv5whptCnc8e8HeRRJxRs0kmm/Qfw==
dependencies:
depd "~1.1.2"
inherits "2.0.4"
setprototypeof "1.1.1"
statuses ">= 1.5.0 < 2"
toidentifier "1.0.0"
iconv-lite@0.4.24:
version "0.4.24"
resolved "https://registry.yarnpkg.com/iconv-lite/-/iconv-lite-0.4.24.tgz#2022b4b25fbddc21d2f524974a474aafe733908b"
integrity sha512-v3MXnZAcvnywkTUEZomIActle7RXXeedOR31wwl7VlyoXO4Qi9arvSenNQWne1TcRwhCL1HwLI21bEqdpj8/rA==
dependencies:
safer-buffer ">= 2.1.2 < 3"
inherits@2.0.3:
version "2.0.3"
resolved "https://registry.yarnpkg.com/inherits/-/inherits-2.0.3.tgz#633c2c83e3da42a502f52466022480f4208261de"
integrity sha1-Yzwsg+PaQqUC9SRmAiSA9CCCYd4=
inherits@2.0.4:
version "2.0.4"
resolved "https://registry.yarnpkg.com/inherits/-/inherits-2.0.4.tgz#0fa2c64f932917c3433a0ded55363aae37416b7c"
integrity sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==
ipaddr.js@1.9.1:
version "1.9.1"
resolved "https://registry.yarnpkg.com/ipaddr.js/-/ipaddr.js-1.9.1.tgz#bff38543eeb8984825079ff3a2a8e6cbd46781b3"
integrity sha512-0KI/607xoxSToH7GjN1FfSbLoU0+btTicjsQSWQlh/hZykN8KpmMf7uYwPW3R+akZ6R/w18ZlXSHBYXiYUPO3g==
media-typer@0.3.0:
version "0.3.0"
resolved "https://registry.yarnpkg.com/media-typer/-/media-typer-0.3.0.tgz#8710d7af0aa626f8fffa1ce00168545263255748"
integrity sha1-hxDXrwqmJvj/+hzgAWhUUmMlV0g=
merge-descriptors@1.0.1:
version "1.0.1"
resolved "https://registry.yarnpkg.com/merge-descriptors/-/merge-descriptors-1.0.1.tgz#b00aaa556dd8b44568150ec9d1b953f3f90cbb61"
integrity sha1-sAqqVW3YtEVoFQ7J0blT8/kMu2E=
methods@~1.1.2:
version "1.1.2"
resolved "https://registry.yarnpkg.com/methods/-/methods-1.1.2.tgz#5529a4d67654134edcc5266656835b0f851afcee"
integrity sha1-VSmk1nZUE07cxSZmVoNbD4Ua/O4=
mime-db@1.46.0:
version "1.46.0"
resolved "https://registry.yarnpkg.com/mime-db/-/mime-db-1.46.0.tgz#6267748a7f799594de3cbc8cde91def349661cee"
integrity sha512-svXaP8UQRZ5K7or+ZmfNhg2xX3yKDMUzqadsSqi4NCH/KomcH75MAMYAGVlvXn4+b/xOPhS3I2uHKRUzvjY7BQ==
mime-types@~2.1.24:
version "2.1.29"
resolved "https://registry.yarnpkg.com/mime-types/-/mime-types-2.1.29.tgz#1d4ab77da64b91f5f72489df29236563754bb1b2"
integrity sha512-Y/jMt/S5sR9OaqteJtslsFZKWOIIqMACsJSiHghlCAyhf7jfVYjKBmLiX8OgpWeW+fjJ2b+Az69aPFPkUOY6xQ==
dependencies:
mime-db "1.46.0"
mime@1.6.0:
version "1.6.0"
resolved "https://registry.yarnpkg.com/mime/-/mime-1.6.0.tgz#32cd9e5c64553bd58d19a568af452acff04981b1"
integrity sha512-x0Vn8spI+wuJ1O6S7gnbaQg8Pxh4NNHb7KSINmEWKiPE4RKOplvijn+NkmYmmRgP68mc70j2EbeTFRsrswaQeg==
ms@2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/ms/-/ms-2.0.0.tgz#5608aeadfc00be6c2901df5f9861788de0d597c8"
integrity sha1-VgiurfwAvmwpAd9fmGF4jeDVl8g=
ms@2.1.1:
version "2.1.1"
resolved "https://registry.yarnpkg.com/ms/-/ms-2.1.1.tgz#30a5864eb3ebb0a66f2ebe6d727af06a09d86e0a"
integrity sha512-tgp+dl5cGk28utYktBsrFqA7HKgrhgPsg6Z/EfhWI4gl1Hwq8B/GmY/0oXZ6nF8hDVesS/FpnYaD/kOWhYQvyg==
nanoid@^3.1.20:
version "3.1.20"
resolved "https://registry.yarnpkg.com/nanoid/-/nanoid-3.1.20.tgz#badc263c6b1dcf14b71efaa85f6ab4c1d6cfc788"
integrity sha512-a1cQNyczgKbLX9jwbS/+d7W8fX/RfgYR7lVWwWOGIPNgK2m0MWvrGF6/m4kk6U3QcFMnZf3RIhL0v2Jgh/0Uxw==
negotiator@0.6.2:
version "0.6.2"
resolved "https://registry.yarnpkg.com/negotiator/-/negotiator-0.6.2.tgz#feacf7ccf525a77ae9634436a64883ffeca346fb"
integrity sha512-hZXc7K2e+PgeI1eDBe/10Ard4ekbfrrqG8Ep+8Jmf4JID2bNg7NvCPOZN+kfF574pFQI7mum2AUqDidoKqcTOw==
object-assign@^4:
version "4.1.1"
resolved "https://registry.yarnpkg.com/object-assign/-/object-assign-4.1.1.tgz#2109adc7965887cfc05cbbd442cac8bfbb360863"
integrity sha1-IQmtx5ZYh8/AXLvUQsrIv7s2CGM=
on-finished@~2.3.0:
version "2.3.0"
resolved "https://registry.yarnpkg.com/on-finished/-/on-finished-2.3.0.tgz#20f1336481b083cd75337992a16971aa2d906947"
integrity sha1-IPEzZIGwg811M3mSoWlxqi2QaUc=
dependencies:
ee-first "1.1.1"
parseurl@~1.3.3:
version "1.3.3"
resolved "https://registry.yarnpkg.com/parseurl/-/parseurl-1.3.3.tgz#9da19e7bee8d12dff0513ed5b76957793bc2e8d4"
integrity sha512-CiyeOxFT/JZyN5m0z9PfXw4SCBJ6Sygz1Dpl0wqjlhDEGGBP1GnsUVEL0p63hoG1fcj3fHynXi9NYO4nWOL+qQ==
path-to-regexp@0.1.7:
version "0.1.7"
resolved "https://registry.yarnpkg.com/path-to-regexp/-/path-to-regexp-0.1.7.tgz#df604178005f522f15eb4490e7247a1bfaa67f8c"
integrity sha1-32BBeABfUi8V60SQ5yR6G/qmf4w=
proxy-addr@~2.0.5:
version "2.0.6"
resolved "https://registry.yarnpkg.com/proxy-addr/-/proxy-addr-2.0.6.tgz#fdc2336505447d3f2f2c638ed272caf614bbb2bf"
integrity sha512-dh/frvCBVmSsDYzw6n926jv974gddhkFPfiN8hPOi30Wax25QZyZEGveluCgliBnqmuM+UJmBErbAUFIoDbjOw==
dependencies:
forwarded "~0.1.2"
ipaddr.js "1.9.1"
qs@6.7.0:
version "6.7.0"
resolved "https://registry.yarnpkg.com/qs/-/qs-6.7.0.tgz#41dc1a015e3d581f1621776be31afb2876a9b1bc"
integrity sha512-VCdBRNFTX1fyE7Nb6FYoURo/SPe62QCaAyzJvUjwRaIsc+NePBEniHlvxFmmX56+HZphIGtV0XeCirBtpDrTyQ==
range-parser@~1.2.1:
version "1.2.1"
resolved "https://registry.yarnpkg.com/range-parser/-/range-parser-1.2.1.tgz#3cf37023d199e1c24d1a55b84800c2f3e6468031"
integrity sha512-Hrgsx+orqoygnmhFbKaHE6c296J+HTAQXoxEF6gNupROmmGJRoyzfG3ccAveqCBrwr/2yxQ5BVd/GTl5agOwSg==
raw-body@2.4.0:
version "2.4.0"
resolved "https://registry.yarnpkg.com/raw-body/-/raw-body-2.4.0.tgz#a1ce6fb9c9bc356ca52e89256ab59059e13d0332"
integrity sha512-4Oz8DUIwdvoa5qMJelxipzi/iJIi40O5cGV1wNYp5hvZP8ZN0T+jiNkL0QepXs+EsQ9XJ8ipEDoiH70ySUJP3Q==
dependencies:
bytes "3.1.0"
http-errors "1.7.2"
iconv-lite "0.4.24"
unpipe "1.0.0"
safe-buffer@5.1.2:
version "5.1.2"
resolved "https://registry.yarnpkg.com/safe-buffer/-/safe-buffer-5.1.2.tgz#991ec69d296e0313747d59bdfd2b745c35f8828d"
integrity sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g==
"safer-buffer@>= 2.1.2 < 3":
version "2.1.2"
resolved "https://registry.yarnpkg.com/safer-buffer/-/safer-buffer-2.1.2.tgz#44fa161b0187b9549dd84bb91802f9bd8385cd6a"
integrity sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg==
send@0.17.1:
version "0.17.1"
resolved "https://registry.yarnpkg.com/send/-/send-0.17.1.tgz#c1d8b059f7900f7466dd4938bdc44e11ddb376c8"
integrity sha512-BsVKsiGcQMFwT8UxypobUKyv7irCNRHk1T0G680vk88yf6LBByGcZJOTJCrTP2xVN6yI+XjPJcNuE3V4fT9sAg==
dependencies:
debug "2.6.9"
depd "~1.1.2"
destroy "~1.0.4"
encodeurl "~1.0.2"
escape-html "~1.0.3"
etag "~1.8.1"
fresh "0.5.2"
http-errors "~1.7.2"
mime "1.6.0"
ms "2.1.1"
on-finished "~2.3.0"
range-parser "~1.2.1"
statuses "~1.5.0"
serve-static@1.14.1:
version "1.14.1"
resolved "https://registry.yarnpkg.com/serve-static/-/serve-static-1.14.1.tgz#666e636dc4f010f7ef29970a88a674320898b2f9"
integrity sha512-JMrvUwE54emCYWlTI+hGrGv5I8dEwmco/00EvkzIIsR7MqrHonbD9pO2MOfFnpFntl7ecpZs+3mW+XbQZu9QCg==
dependencies:
encodeurl "~1.0.2"
escape-html "~1.0.3"
parseurl "~1.3.3"
send "0.17.1"
setprototypeof@1.1.1:
version "1.1.1"
resolved "https://registry.yarnpkg.com/setprototypeof/-/setprototypeof-1.1.1.tgz#7e95acb24aa92f5885e0abef5ba131330d4ae683"
integrity sha512-JvdAWfbXeIGaZ9cILp38HntZSFSo3mWg6xGcJJsd+d4aRMOqauag1C63dJfDw7OaMYwEbHMOxEZ1lqVRYP2OAw==
"statuses@>= 1.5.0 < 2", statuses@~1.5.0:
version "1.5.0"
resolved "https://registry.yarnpkg.com/statuses/-/statuses-1.5.0.tgz#161c7dac177659fd9811f43771fa99381478628c"
integrity sha1-Fhx9rBd2Wf2YEfQ3cfqZOBR4Yow=
toidentifier@1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/toidentifier/-/toidentifier-1.0.0.tgz#7e1be3470f1e77948bc43d94a3c8f4d7752ba553"
integrity sha512-yaOH/Pk/VEhBWWTlhI+qXxDFXlejDGcQipMlyxda9nthulaxLZUNcUqFxokp0vcYnvteJln5FNQDRrxj3YcbVw==
type-is@~1.6.17, type-is@~1.6.18:
version "1.6.18"
resolved "https://registry.yarnpkg.com/type-is/-/type-is-1.6.18.tgz#4e552cd05df09467dcbc4ef739de89f2cf37c131"
integrity sha512-TkRKr9sUTxEH8MdfuCSP7VizJyzRNMjj2J2do2Jr3Kym598JVdEksuzPQCnlFPW4ky9Q+iA+ma9BGm06XQBy8g==
dependencies:
media-typer "0.3.0"
mime-types "~2.1.24"
unpipe@1.0.0, unpipe@~1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/unpipe/-/unpipe-1.0.0.tgz#b2bf4ee8514aae6165b4817829d21b2ef49904ec"
integrity sha1-sr9O6FFKrmFltIF4KdIbLvSZBOw=
utils-merge@1.0.1:
version "1.0.1"
resolved "https://registry.yarnpkg.com/utils-merge/-/utils-merge-1.0.1.tgz#9f95710f50a267947b2ccc124741c1028427e713"
integrity sha1-n5VxD1CiZ5R7LMwSR0HBAoQn5xM=
vary@^1, vary@~1.1.2:
version "1.1.2"
resolved "https://registry.yarnpkg.com/vary/-/vary-1.1.2.tgz#2299f02c6ded30d4a5961b0b9f74524a18f634fc"
integrity sha1-IpnwLG3tMNSllhsLn3RSShj2NPw=

View File

@@ -0,0 +1,41 @@
const { exec } = require('child_process')
const { promises } = require('fs')
const { writeFile } = promises
async function makeFile(file, extension = '.scad', makeHash) {
const tempFile = 'a' + makeHash() // 'a' ensures nothing funny happens if it starts with a bad character like "-", maybe I should pick a safer id generator :shrug:
console.log(`file to write: ${file}`)
await runCommand(`mkdir /tmp/${tempFile}`)
await writeFile(`/tmp/${tempFile}/main${extension}`, file)
return tempFile
}
async function runCommand(command, timeout = 5000) {
return new Promise((resolve, reject) => {
exec(command, (error, stdout, stderr) => {
if (error) {
console.log(`error: ${error.message}`)
console.log(`stderr: ${stderr}`)
console.log(`stdout: ${stdout}`)
reject(stdout || stderr) // it seems random if the message is in stdout or stderr, but not normally both
return
}
if (stderr) {
console.log(`stderr: ${stderr}`)
resolve(stderr)
return
}
console.log(`stdout: ${stdout}`)
resolve(stdout)
})
setTimeout(() => {
reject('timeout')
}, timeout)
})
}
module.exports = {
runCommand,
makeFile,
}

View File

@@ -1,151 +0,0 @@
const { exec } = require('child_process')
const { promises } = require('fs')
const { writeFile } = promises
const { createHash } = require('crypto')
import { readFile } from 'fs/promises'
export async function writeFiles(
files: { file: string; fileName: string }[] = [],
tempFile: string
): Promise<string> {
console.log(`file to write: ${files.length}`)
try {
await runCommand(`mkdir /tmp/${tempFile}`)
} catch (e) {
//
}
await Promise.all(
files.map(({ file, fileName }) =>
writeFile(`/tmp/${tempFile}/${fileName}`, file)
)
)
return tempFile
}
export async function runCommand(
command,
timeout = 5000,
shouldRejectStdErr = false
): Promise<string> {
return new Promise((resolve, reject) => {
exec(command, (error, stdout, stderr) => {
if (error) {
console.log(`error: ${error.message}`)
console.log(`stderr: ${stderr}`)
console.log(`stdout: ${stdout}`)
reject(stdout || stderr) // it seems random if the message is in stdout or stderr, but not normally both
return
}
if (stderr) {
console.log(`stderr: ${stderr}`)
if (shouldRejectStdErr) {
reject(stderr)
return
}
resolve(stderr)
return
}
console.log(`stdout: ${stdout}`)
resolve(stdout)
})
setTimeout(() => {
reject('timeout')
}, timeout)
})
}
function makeHash(script) {
return createHash('sha256').update(script).digest('hex')
}
async function checkIfAlreadyExists(params, s3) {
try {
await s3.headObject(params).promise()
return { isAlreadyInBucket: true }
} catch (e) {
console.log("couldn't find it", e)
return { isAlreadyInBucket: false }
}
}
function getObjectUrl(params, s3, tk) {
const getTruncatedTime = () => {
const currentTime = new Date()
const d = new Date(currentTime)
d.setMinutes(Math.floor(d.getMinutes() / 10) * 10)
d.setSeconds(0)
d.setMilliseconds(0)
return d
}
const HALF_HOUR = 1800
return tk.withFreeze(getTruncatedTime(), () =>
s3.getSignedUrl('getObject', {
...params,
Expires: HALF_HOUR,
})
)
}
export function loggerWrap(handler) {
return (req, _context, callback) => {
try {
return handler(req, _context, callback)
} catch (e) {
console.log('error in handler', e)
}
}
}
export async function storeAssetAndReturnUrl({
error,
callback,
fullPath,
consoleMessage,
}: {
error: string
callback: Function
fullPath: string
consoleMessage: string
}) {
if (error) {
const response = {
statusCode: 400,
body: Buffer.from(JSON.stringify({ error, fullPath })).toString('base64'),
isBase64Encoded: true,
}
callback(null, response)
return
} else {
console.log(`got result in route: ${consoleMessage}, file is: ${fullPath}`)
let buffer = ''
try {
buffer = await readFile(fullPath, { encoding: 'base64' })
} catch (e) {
console.log('read file error', e)
const response = {
statusCode: 400,
body: Buffer.from(
JSON.stringify({ error: consoleMessage, fullPath })
).toString('base64'),
isBase64Encoded: true,
}
callback(null, response)
return
}
const response = {
statusCode: 200,
body: buffer,
isBase64Encoded: true,
headers: {
'Content-Type': 'application/javascript',
'Content-Encoding': 'gzip',
},
}
callback(null, response)
return
}
}

View File

@@ -1,47 +1,44 @@
services:
# aws-emulator:
# build: .
# networks:
# - awsland
# ports:
# - "5050:8080"
openscad-health:
build:
context: ./
dockerfile: ./openscad/.
image: openscad
command: openscad.health
ports:
- "5051:8080"
openscad-preview:
build:
context: ../../../
dockerfile: ./api/src/docker/openscad/Dockerfile
image: openscad
# build: ./openscad/.
command: openscad.preview
# Adding volumes so that the containers can be restarted for js only changes in local dev
volumes:
- ../dist/docker/openscad:/var/task/js/
- ../dist/docker/common:/var/task/common/
# networks:
# - awsland
ports:
- "5052:8080"
environment:
AWS_SECRET_ACCESS_KEY: "${DEV_AWS_SECRET_ACCESS_KEY}"
AWS_ACCESS_KEY_ID: "${DEV_AWS_ACCESS_KEY_ID}"
BUCKET: "${DEV_BUCKET}"
openscad-stl:
image: openscad
volumes:
- ../dist/docker/openscad:/var/task/js/
- ../dist/docker/common:/var/task/common/
# build: ./openscad/.
command: openscad.stl
ports:
- "5053:8080"
environment:
AWS_SECRET_ACCESS_KEY: "${DEV_AWS_SECRET_ACCESS_KEY}"
AWS_ACCESS_KEY_ID: "${DEV_AWS_ACCESS_KEY_ID}"
BUCKET: "${DEV_BUCKET}"
cadquery-stl:
build:
context: ../../../
dockerfile: ./api/src/docker/cadquery/Dockerfile
volumes:
- ../dist/docker/cadquery:/var/task/js/
- ../dist/docker/common:/var/task/common/
context: ./
dockerfile: ./cadquery/.
command: cadquery.stl
ports:
- 5060:8080
environment:
AWS_SECRET_ACCESS_KEY: "${DEV_AWS_SECRET_ACCESS_KEY}"
AWS_ACCESS_KEY_ID: "${DEV_AWS_ACCESS_KEY_ID}"
BUCKET: "${DEV_BUCKET}"
# networks:
# awsland:
# name: awsland

View File

@@ -3,18 +3,15 @@ FROM public.ecr.aws/lts/ubuntu:20.04_stable
ARG DEBIAN_FRONTEND=noninteractive
## install things needed to run openscad (xvfb is an important one)
RUN apt-get update --fix-missing -qq
RUN apt-get update -qq
# double-check this below; we may not need inkscape etc.
RUN apt-get -y -qq install software-properties-common dirmngr apt-transport-https lsb-release ca-certificates xvfb imagemagick unzip inkscape
RUN apt-get install -y curl wget
RUN touch /etc/apt/sources.list.d/openscad.list
RUN echo "deb https://download.opensuse.org/repositories/home:/t-paul/xUbuntu_20.04/ ./" >> /etc/apt/sources.list.d/openscad.list
RUN wget -qO - https://files.openscad.org/OBS-Repository-Key.pub | apt-key add -
RUN apt-get update -qq
RUN apt-get install -y openscad-nightly
RUN apt-get install -y -qq openscad
RUN apt-get install -y curl
# install node 14, see comment at the top of node14source_setup.sh
ADD api/src/docker/common/node14source_setup.sh /nodesource_setup.sh
ADD common/node14source_setup.sh /nodesource_setup.sh
RUN ["chmod", "+x", "/nodesource_setup.sh"]
RUN bash nodesource_setup.sh
RUN apt-get install -y nodejs
@@ -32,35 +29,18 @@ RUN apt-get update && \
# Add the lambda emulator for local dev (see entrypoint.sh for where it's used).
# I have the file locally (gitignored) to speed up build times (as it downloads every time),
# but you can use the http version of the ADD command below or download it yourself from that url.
ADD api/src/docker/common/aws-lambda-rie /usr/local/bin/aws-lambda-rie
ADD common/aws-lambda-rie /usr/local/bin/aws-lambda-rie
# ADD https://github.com/aws/aws-lambda-runtime-interface-emulator/releases/download/v1.0/aws-lambda-rie /usr/local/bin/aws-lambda-rie
RUN ["chmod", "+x", "/usr/local/bin/aws-lambda-rie"]
WORKDIR /var/task/
# aws-lambda-ric does not play nice with yarn, so installing it separately;
# circle back to this later for a proper solution
COPY package*.json /var/task/
RUN npm install aws-lambda-ric@1.0.0
# Install OpenSCAD libraries
# It's experimental, so only adding latest Round-Anything for now
RUN echo "OPENSCADPATH=/var/task/openscad" >>/etc/profile && \
wget -P /var/task/openscad/ https://github.com/Irev-Dev/Round-Anything/archive/refs/tags/1.0.4.zip && \
unzip /var/task/openscad/1.0.4
# Add our own theming (based on DeepOcean with a different "background" and "opencsg-face-back")
COPY api/src/docker/openscad/cadhubtheme.json /usr/share/openscad-nightly/color-schemes/render/
RUN echo "cadhub-concat-split" > /var/task/cadhub-concat-split
# using built javascript from dist
# run `yarn rw build` and $(npm bin)/zip-it-and-ship-it api/dist/functions/ api/dist/zipball before building this image
COPY api/dist/zipball/openscad.zip /var/task/
# -n stops aws-lamda-ric from being overridden.
RUN unzip -n /var/task/openscad.zip
COPY api/src/docker/common/entrypoint.sh /entrypoint.sh
COPY openscad/package*.json /var/task/
RUN npm install
COPY openscad/*.js /var/task/
COPY common/*.js /var/common/
COPY common/entrypoint.sh /entrypoint.sh
RUN ["chmod", "+x", "/entrypoint.sh"]
ENTRYPOINT ["sh", "/entrypoint.sh"]
CMD [ "openscad.preview" ]
CMD [ "openscad.render" ]

View File

@@ -1,19 +0,0 @@
{
"name" : "CadHub",
"index" : 1600,
"show-in-gui" : true,
"colors" : {
"background" : "#1A1A1D",
"axes-color" : "#c1c1c1",
"opencsg-face-front" : "#eeeeee",
"opencsg-face-back" : "#8732F2",
"cgal-face-front" : "#eeeeee",
"cgal-face-back" : "#0babc8",
"cgal-face-2d" : "#9370db",
"cgal-edge-front" : "#0000ff",
"cgal-edge-back" : "#0000ff",
"cgal-edge-2d" : "#ff00ff",
"crosshair" : "#f0f0f0"
}
}

View File

@@ -0,0 +1,95 @@
const { runScad, stlExport } = require('./runScad')
const middy = require('middy')
const { cors } = require('middy/middlewares')
const health = async () => {
console.log('Health endpoint')
return {
statusCode: 200,
body: 'ok',
}
}
// `cors: true` does not seem to work in serverless.yml; perhaps docker lambdas aren't covered by that config
// special lambda just for responding to options requests
const preflightOptions = (req, _context, callback) => {
const response = {
statusCode: 204,
headers: {
'Access-Control-Allow-Origin': '*',
'Access-Control-Allow-Methods': 'POST',
'Access-Control-Allow-Headers': '*',
},
}
callback(null, response)
}
const preview = async (req, _context, callback) => {
_context.callbackWaitsForEmptyEventLoop = false
const eventBody = Buffer.from(req.body, 'base64').toString('ascii')
console.log(eventBody, 'eventBody')
const { file, settings } = JSON.parse(eventBody)
const { error, result, tempFile } = await runScad({ file, settings })
if (error) {
const response = {
statusCode: 400,
body: JSON.stringify({ error, tempFile }),
}
callback(null, response)
} else {
console.log(`got result in route: ${result}, file is: ${tempFile}`)
const fs = require('fs')
const image = fs.readFileSync(`/tmp/${tempFile}/output.png`, {
encoding: 'base64',
})
console.log(image, 'encoded image')
const response = {
statusCode: 200,
body: JSON.stringify({
imageBase64: image,
result,
tempFile,
}),
}
callback(null, response)
}
}
const stl = async (req, _context, callback) => {
_context.callbackWaitsForEmptyEventLoop = false
const eventBody = Buffer.from(req.body, 'base64').toString('ascii')
console.log(eventBody, 'eventBody')
const { file } = JSON.parse(eventBody)
const { error, result, tempFile } = await stlExport({ file })
if (error) {
const response = {
statusCode: 400,
body: { error, tempFile },
}
callback(null, response)
} else {
console.log(`got result in route: ${result}, file is: ${tempFile}`)
const fs = require('fs')
const stl = fs.readFileSync(`/tmp/${tempFile}/output.stl`, {
encoding: 'base64',
})
console.log('encoded stl', stl)
const response = {
statusCode: 200,
headers: {
'content-type': 'application/stl',
},
body: stl,
isBase64Encoded: true,
}
console.log('callback fired')
callback(null, response)
}
}
module.exports = {
health: middy(health).use(cors()),
stl: middy(stl).use(cors()),
preview: middy(preview).use(cors()),
preflightOptions,
}

View File

@@ -1,44 +0,0 @@
import { runScad, stlExport } from './runScad'
import middy from 'middy'
import { cors } from 'middy/middlewares'
import { loggerWrap, storeAssetAndReturnUrl } from '../common/utils'
const _preview = async (req, _context, callback) => {
_context.callbackWaitsForEmptyEventLoop = false
const eventBody = Buffer.from(req.body, 'base64').toString('ascii')
console.log('eventBody', eventBody)
const { file, settings } = JSON.parse(eventBody)
const { error, consoleMessage, fullPath } = await runScad({
file,
settings,
})
await storeAssetAndReturnUrl({
error,
callback,
fullPath,
consoleMessage,
})
}
const _stl = async (req, _context, callback) => {
_context.callbackWaitsForEmptyEventLoop = false
const eventBody = Buffer.from(req.body, 'base64').toString('ascii')
console.log(eventBody, 'eventBody')
const { file, settings } = JSON.parse(eventBody)
const { error, consoleMessage, fullPath } = await stlExport({
file,
settings,
})
await storeAssetAndReturnUrl({
error,
callback,
fullPath,
consoleMessage,
})
}
export const stl = middy(loggerWrap(_stl)).use(cors())
export const preview = middy(loggerWrap(_preview)).use(cors())

View File

@@ -0,0 +1,16 @@
{
"name": "openscad-endpoint",
"version": "0.0.1",
"description": "endpoint for openscad",
"main": "index.js",
"author": "Kurt Hutten <kurt@kurthutten.com>",
"license": "",
"dependencies": {
"cors": "^2.8.5",
"middy": "^0.36.0",
"nanoid": "^3.1.20"
},
"devDependencies": {
"aws-lambda-ric": "^1.0.0"
}
}

View File

@@ -0,0 +1,42 @@
const { makeFile, runCommand } = require('../common/utils')
const { nanoid } = require('nanoid')
module.exports.runScad = async ({
file,
settings: {
size: { x = 500, y = 500 } = {},
camera: {
position = { x: 40, y: 40, z: 40 },
rotation = { x: 55, y: 0, z: 25 },
dist = 200,
} = {},
} = {}, // TODO add view settings
} = {}) => {
const tempFile = await makeFile(file, '.scad', nanoid)
const { x: rx, y: ry, z: rz } = rotation
const { x: px, y: py, z: pz } = position
const cameraArg = `--camera=${px},${py},${pz},${rx},${ry},${rz},${dist}`
const command = `xvfb-run --auto-servernum --server-args "-screen 0 1024x768x24" openscad -o /tmp/${tempFile}/output.png ${cameraArg} --imgsize=${x},${y} --colorscheme DeepOcean /tmp/${tempFile}/main.scad`
console.log('command', command)
try {
const result = await runCommand(command, 15000)
return { result, tempFile }
} catch (error) {
return { error, tempFile }
}
}
module.exports.stlExport = async ({ file } = {}) => {
const tempFile = await makeFile(file, '.scad', nanoid)
try {
const result = await runCommand(
`openscad -o /tmp/${tempFile}/output.stl /tmp/${tempFile}/main.scad`,
300000 // the lambda will time out before this; we might need to look at background jobs if we do git-integration STL generation
)
return { result, tempFile }
} catch (error) {
return { error, tempFile }
}
}
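
For the default settings above (with the nanoid temp directory written as <id>), the assembled command comes out as:

xvfb-run --auto-servernum --server-args "-screen 0 1024x768x24" openscad -o /tmp/<id>/output.png --camera=40,40,40,55,0,25,200 --imgsize=500,500 --colorscheme DeepOcean /tmp/<id>/main.scad

i.e. --camera takes the position x,y,z, then the rotation x,y,z, then the camera distance.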

View File

@@ -1,150 +0,0 @@
import { writeFiles, runCommand } from '../common/utils'
import { nanoid } from 'nanoid'
const { readFile } = require('fs/promises')
const OPENSCAD_COMMON = `xvfb-run --auto-servernum --server-args "-screen 0 1024x768x24" openscad-nightly`
/** Replaces our generated hash filename with just "main.scad", so that errors read nicely in the IDE */
const cleanOpenScadError = (error) =>
error.replace(/["|']\/tmp\/.+\/main.scad["|']/g, "'main.scad'")
export const runScad = async ({
file,
settings: {
viewAll = false,
size: { x = 500, y = 500 } = {},
parameters,
camera: {
position = { x: 40, y: 40, z: 40 },
rotation = { x: 55, y: 0, z: 25 },
dist = 200,
} = {},
} = {}, // TODO add view settings
} = {}): Promise<{
error?: string
consoleMessage?: string
fullPath?: string
customizerPath?: string
}> => {
const tempFile = await writeFiles(
[
{ file, fileName: 'main.scad' },
{
file: JSON.stringify({
parameterSets: { default: parameters },
fileFormatVersion: '1',
}),
fileName: 'params.json',
},
],
'a' + nanoid() // 'a' ensures nothing funny happens if the id starts with a bad character like "-"; maybe we should pick a safer id generator
)
const { x: rx, y: ry, z: rz } = rotation
const { x: px, y: py, z: pz } = position
const cameraArg = `--camera=${px},${py},${pz},${rx},${ry},${rz},${dist}`
const fullPath = `/tmp/${tempFile}/output.gz`
const imPath = `/tmp/${tempFile}/output.png`
const customizerPath = `/tmp/${tempFile}/customizer.param`
const summaryPath = `/tmp/${tempFile}/summary.json` // contains camera info
const command = [
OPENSCAD_COMMON,
`-o ${customizerPath}`,
`-o ${imPath}`,
`--summary camera --summary-file ${summaryPath}`,
viewAll ? '--viewall' : '',
`-p /tmp/${tempFile}/params.json -P default`,
cameraArg,
`--imgsize=${x},${y}`,
`--colorscheme CadHub`,
`/tmp/${tempFile}/main.scad`,
].join(' ')
console.log('command', command)
try {
const consoleMessage = await runCommand(command, 15000)
const files: string[] = await Promise.all(
[customizerPath, summaryPath].map((path) =>
readFile(path, { encoding: 'ascii' })
)
)
const [params, cameraInfo] = files.map((fileStr: string) =>
JSON.parse(fileStr)
)
await writeFiles(
[
{
file: JSON.stringify({
cameraInfo: viewAll ? cameraInfo.camera : undefined,
customizerParams: params.parameters,
consoleMessage,
type: 'png',
}),
fileName: 'metadata.json',
},
],
tempFile
)
await runCommand(
`cat ${imPath} /var/task/cadhub-concat-split /tmp/${tempFile}/metadata.json | gzip > ${fullPath}`,
15000
)
return { consoleMessage, fullPath, customizerPath }
} catch (dirtyError) {
return { error: cleanOpenScadError(dirtyError) }
}
}
export const stlExport = async ({ file, settings: { parameters } = {} } = {}) => {
const tempFile = await writeFiles(
[
{ file, fileName: 'main.scad' },
{
file: JSON.stringify({
parameterSets: { default: parameters },
fileFormatVersion: '1',
}),
fileName: 'params.json',
},
],
'a' + nanoid() // 'a' ensures nothing funny happens if the id starts with a bad character like "-"; maybe we should pick a safer id generator
)
const fullPath = `/tmp/${tempFile}/output.gz`
const stlPath = `/tmp/${tempFile}/output.stl`
const customizerPath = `/tmp/${tempFile}/customizer.param`
const command = [
OPENSCAD_COMMON,
// `--export-format=binstl`,
`-o ${customizerPath}`,
`-o ${stlPath}`,
`-p /tmp/${tempFile}/params.json -P default`,
`/tmp/${tempFile}/main.scad`,
].join(' ')
try {
// the lambda will time out before this; we might need to look at background jobs if we do git-integration STL generation
const consoleMessage = await runCommand(command, 60000)
const params = JSON.parse(
await readFile(customizerPath, { encoding: 'ascii' })
).parameters
await writeFiles(
[
{
file: JSON.stringify({
customizerParams: params,
consoleMessage,
type: 'stl',
}),
fileName: 'metadata.json',
},
],
tempFile
)
await runCommand(
`cat ${stlPath} /var/task/cadhub-concat-split /tmp/${tempFile}/metadata.json | gzip > ${fullPath}`,
15000
)
return { consoleMessage, fullPath, customizerPath }
} catch (error) {
return { error, fullPath }
}
}
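
The `cat ... cadhub-concat-split ... | gzip` steps above pack the rendered asset and its metadata JSON into one gzipped payload, separated by the delimiter file written in the Dockerfile. Below is a sketch of how a consumer could split it back apart — the function name is illustrative; the delimiter string (plus the trailing newline `echo` adds) comes from the Dockerfile.

import { gunzipSync } from 'zlib'

// Sketch: recover the asset (png/stl) and the metadata JSON from the payload
// produced by the cat/gzip step above.
const splitCadhubPayload = (gzipped: Buffer) => {
  const unzipped = gunzipSync(gzipped)
  const delimiter = Buffer.from('cadhub-concat-split\n')
  const splitAt = unzipped.indexOf(delimiter)
  const asset = unzipped.slice(0, splitAt)
  const metadata = JSON.parse(
    unzipped.slice(splitAt + delimiter.length).toString('utf8')
  )
  return { asset, metadata }
}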

View File

@@ -0,0 +1,386 @@
# THIS IS AN AUTOGENERATED FILE. DO NOT EDIT THIS FILE DIRECTLY.
# yarn lockfile v1
accepts@~1.3.7:
version "1.3.7"
resolved "https://registry.yarnpkg.com/accepts/-/accepts-1.3.7.tgz#531bc726517a3b2b41f850021c6cc15eaab507cd"
integrity sha512-Il80Qs2WjYlJIBNzNkK6KYqlVMTbZLXgHx2oT0pU/fjRHyEp+PEfEPY0R3WCwAGVOtauxh1hOxNgIf5bv7dQpA==
dependencies:
mime-types "~2.1.24"
negotiator "0.6.2"
array-flatten@1.1.1:
version "1.1.1"
resolved "https://registry.yarnpkg.com/array-flatten/-/array-flatten-1.1.1.tgz#9a5f699051b1e7073328f2a008968b64ea2955d2"
integrity sha1-ml9pkFGx5wczKPKgCJaLZOopVdI=
body-parser@1.19.0:
version "1.19.0"
resolved "https://registry.yarnpkg.com/body-parser/-/body-parser-1.19.0.tgz#96b2709e57c9c4e09a6fd66a8fd979844f69f08a"
integrity sha512-dhEPs72UPbDnAQJ9ZKMNTP6ptJaionhP5cBb541nXPlW60Jepo9RV/a4fX4XWW9CuFNK22krhrj1+rgzifNCsw==
dependencies:
bytes "3.1.0"
content-type "~1.0.4"
debug "2.6.9"
depd "~1.1.2"
http-errors "1.7.2"
iconv-lite "0.4.24"
on-finished "~2.3.0"
qs "6.7.0"
raw-body "2.4.0"
type-is "~1.6.17"
bytes@3.1.0:
version "3.1.0"
resolved "https://registry.yarnpkg.com/bytes/-/bytes-3.1.0.tgz#f6cf7933a360e0588fa9fde85651cdc7f805d1f6"
integrity sha512-zauLjrfCG+xvoyaqLoV8bLVXXNGC4JqlxFCutSDWA6fJrTo2ZuvLYTqZ7aHBLZSMOopbzwv8f+wZcVzfVTI2Dg==
content-disposition@0.5.3:
version "0.5.3"
resolved "https://registry.yarnpkg.com/content-disposition/-/content-disposition-0.5.3.tgz#e130caf7e7279087c5616c2007d0485698984fbd"
integrity sha512-ExO0774ikEObIAEV9kDo50o+79VCUdEB6n6lzKgGwupcVeRlhrj3qGAfwq8G6uBJjkqLrhT0qEYFcWng8z1z0g==
dependencies:
safe-buffer "5.1.2"
content-type@~1.0.4:
version "1.0.4"
resolved "https://registry.yarnpkg.com/content-type/-/content-type-1.0.4.tgz#e138cc75e040c727b1966fe5e5f8c9aee256fe3b"
integrity sha512-hIP3EEPs8tB9AT1L+NUqtwOAps4mk2Zob89MWXMHjHWg9milF/j4osnnQLXBCBFBk/tvIG/tUc9mOUJiPBhPXA==
cookie-signature@1.0.6:
version "1.0.6"
resolved "https://registry.yarnpkg.com/cookie-signature/-/cookie-signature-1.0.6.tgz#e303a882b342cc3ee8ca513a79999734dab3ae2c"
integrity sha1-4wOogrNCzD7oylE6eZmXNNqzriw=
cookie@0.4.0:
version "0.4.0"
resolved "https://registry.yarnpkg.com/cookie/-/cookie-0.4.0.tgz#beb437e7022b3b6d49019d088665303ebe9c14ba"
integrity sha512-+Hp8fLp57wnUSt0tY0tHEXh4voZRDnoIrZPqlo3DPiI4y9lwg/jqx+1Om94/W6ZaPDOUbnjOt/99w66zk+l1Xg==
cors@^2.8.5:
version "2.8.5"
resolved "https://registry.yarnpkg.com/cors/-/cors-2.8.5.tgz#eac11da51592dd86b9f06f6e7ac293b3df875d29"
integrity sha512-KIHbLJqu73RGr/hnbrO9uBeixNGuvSQjul/jdFvS/KFSIH1hWVd1ng7zOHx+YrEfInLG7q4n6GHQ9cDtxv/P6g==
dependencies:
object-assign "^4"
vary "^1"
debug@2.6.9:
version "2.6.9"
resolved "https://registry.yarnpkg.com/debug/-/debug-2.6.9.tgz#5d128515df134ff327e90a4c93f4e077a536341f"
integrity sha512-bC7ElrdJaJnPbAP+1EotYvqZsb3ecl5wi6Bfi6BJTUcNowp6cvspg0jXznRTKDjm/E7AdgFBVeAPVMNcKGsHMA==
dependencies:
ms "2.0.0"
depd@~1.1.2:
version "1.1.2"
resolved "https://registry.yarnpkg.com/depd/-/depd-1.1.2.tgz#9bcd52e14c097763e749b274c4346ed2e560b5a9"
integrity sha1-m81S4UwJd2PnSbJ0xDRu0uVgtak=
destroy@~1.0.4:
version "1.0.4"
resolved "https://registry.yarnpkg.com/destroy/-/destroy-1.0.4.tgz#978857442c44749e4206613e37946205826abd80"
integrity sha1-l4hXRCxEdJ5CBmE+N5RiBYJqvYA=
ee-first@1.1.1:
version "1.1.1"
resolved "https://registry.yarnpkg.com/ee-first/-/ee-first-1.1.1.tgz#590c61156b0ae2f4f0255732a158b266bc56b21d"
integrity sha1-WQxhFWsK4vTwJVcyoViyZrxWsh0=
encodeurl@~1.0.2:
version "1.0.2"
resolved "https://registry.yarnpkg.com/encodeurl/-/encodeurl-1.0.2.tgz#ad3ff4c86ec2d029322f5a02c3a9a606c95b3f59"
integrity sha1-rT/0yG7C0CkyL1oCw6mmBslbP1k=
escape-html@~1.0.3:
version "1.0.3"
resolved "https://registry.yarnpkg.com/escape-html/-/escape-html-1.0.3.tgz#0258eae4d3d0c0974de1c169188ef0051d1d1988"
integrity sha1-Aljq5NPQwJdN4cFpGI7wBR0dGYg=
etag@~1.8.1:
version "1.8.1"
resolved "https://registry.yarnpkg.com/etag/-/etag-1.8.1.tgz#41ae2eeb65efa62268aebfea83ac7d79299b0887"
integrity sha1-Qa4u62XvpiJorr/qg6x9eSmbCIc=
express@^4.17.1:
version "4.17.1"
resolved "https://registry.yarnpkg.com/express/-/express-4.17.1.tgz#4491fc38605cf51f8629d39c2b5d026f98a4c134"
integrity sha512-mHJ9O79RqluphRrcw2X/GTh3k9tVv8YcoyY4Kkh4WDMUYKRZUq0h1o0w2rrrxBqM7VoeUVqgb27xlEMXTnYt4g==
dependencies:
accepts "~1.3.7"
array-flatten "1.1.1"
body-parser "1.19.0"
content-disposition "0.5.3"
content-type "~1.0.4"
cookie "0.4.0"
cookie-signature "1.0.6"
debug "2.6.9"
depd "~1.1.2"
encodeurl "~1.0.2"
escape-html "~1.0.3"
etag "~1.8.1"
finalhandler "~1.1.2"
fresh "0.5.2"
merge-descriptors "1.0.1"
methods "~1.1.2"
on-finished "~2.3.0"
parseurl "~1.3.3"
path-to-regexp "0.1.7"
proxy-addr "~2.0.5"
qs "6.7.0"
range-parser "~1.2.1"
safe-buffer "5.1.2"
send "0.17.1"
serve-static "1.14.1"
setprototypeof "1.1.1"
statuses "~1.5.0"
type-is "~1.6.18"
utils-merge "1.0.1"
vary "~1.1.2"
finalhandler@~1.1.2:
version "1.1.2"
resolved "https://registry.yarnpkg.com/finalhandler/-/finalhandler-1.1.2.tgz#b7e7d000ffd11938d0fdb053506f6ebabe9f587d"
integrity sha512-aAWcW57uxVNrQZqFXjITpW3sIUQmHGG3qSb9mUah9MgMC4NeWhNOlNjXEYq3HjRAvL6arUviZGGJsBg6z0zsWA==
dependencies:
debug "2.6.9"
encodeurl "~1.0.2"
escape-html "~1.0.3"
on-finished "~2.3.0"
parseurl "~1.3.3"
statuses "~1.5.0"
unpipe "~1.0.0"
forwarded@~0.1.2:
version "0.1.2"
resolved "https://registry.yarnpkg.com/forwarded/-/forwarded-0.1.2.tgz#98c23dab1175657b8c0573e8ceccd91b0ff18c84"
integrity sha1-mMI9qxF1ZXuMBXPozszZGw/xjIQ=
fresh@0.5.2:
version "0.5.2"
resolved "https://registry.yarnpkg.com/fresh/-/fresh-0.5.2.tgz#3d8cadd90d976569fa835ab1f8e4b23a105605a7"
integrity sha1-PYyt2Q2XZWn6g1qx+OSyOhBWBac=
http-errors@1.7.2:
version "1.7.2"
resolved "https://registry.yarnpkg.com/http-errors/-/http-errors-1.7.2.tgz#4f5029cf13239f31036e5b2e55292bcfbcc85c8f"
integrity sha512-uUQBt3H/cSIVfch6i1EuPNy/YsRSOUBXTVfZ+yR7Zjez3qjBz6i9+i4zjNaoqcoFVI4lQJ5plg63TvGfRSDCRg==
dependencies:
depd "~1.1.2"
inherits "2.0.3"
setprototypeof "1.1.1"
statuses ">= 1.5.0 < 2"
toidentifier "1.0.0"
http-errors@~1.7.2:
version "1.7.3"
resolved "https://registry.yarnpkg.com/http-errors/-/http-errors-1.7.3.tgz#6c619e4f9c60308c38519498c14fbb10aacebb06"
integrity sha512-ZTTX0MWrsQ2ZAhA1cejAwDLycFsd7I7nVtnkT3Ol0aqodaKW+0CTZDQ1uBv5whptCnc8e8HeRRJxRs0kmm/Qfw==
dependencies:
depd "~1.1.2"
inherits "2.0.4"
setprototypeof "1.1.1"
statuses ">= 1.5.0 < 2"
toidentifier "1.0.0"
iconv-lite@0.4.24:
version "0.4.24"
resolved "https://registry.yarnpkg.com/iconv-lite/-/iconv-lite-0.4.24.tgz#2022b4b25fbddc21d2f524974a474aafe733908b"
integrity sha512-v3MXnZAcvnywkTUEZomIActle7RXXeedOR31wwl7VlyoXO4Qi9arvSenNQWne1TcRwhCL1HwLI21bEqdpj8/rA==
dependencies:
safer-buffer ">= 2.1.2 < 3"
inherits@2.0.3:
version "2.0.3"
resolved "https://registry.yarnpkg.com/inherits/-/inherits-2.0.3.tgz#633c2c83e3da42a502f52466022480f4208261de"
integrity sha1-Yzwsg+PaQqUC9SRmAiSA9CCCYd4=
inherits@2.0.4:
version "2.0.4"
resolved "https://registry.yarnpkg.com/inherits/-/inherits-2.0.4.tgz#0fa2c64f932917c3433a0ded55363aae37416b7c"
integrity sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==
ipaddr.js@1.9.1:
version "1.9.1"
resolved "https://registry.yarnpkg.com/ipaddr.js/-/ipaddr.js-1.9.1.tgz#bff38543eeb8984825079ff3a2a8e6cbd46781b3"
integrity sha512-0KI/607xoxSToH7GjN1FfSbLoU0+btTicjsQSWQlh/hZykN8KpmMf7uYwPW3R+akZ6R/w18ZlXSHBYXiYUPO3g==
media-typer@0.3.0:
version "0.3.0"
resolved "https://registry.yarnpkg.com/media-typer/-/media-typer-0.3.0.tgz#8710d7af0aa626f8fffa1ce00168545263255748"
integrity sha1-hxDXrwqmJvj/+hzgAWhUUmMlV0g=
merge-descriptors@1.0.1:
version "1.0.1"
resolved "https://registry.yarnpkg.com/merge-descriptors/-/merge-descriptors-1.0.1.tgz#b00aaa556dd8b44568150ec9d1b953f3f90cbb61"
integrity sha1-sAqqVW3YtEVoFQ7J0blT8/kMu2E=
methods@~1.1.2:
version "1.1.2"
resolved "https://registry.yarnpkg.com/methods/-/methods-1.1.2.tgz#5529a4d67654134edcc5266656835b0f851afcee"
integrity sha1-VSmk1nZUE07cxSZmVoNbD4Ua/O4=
mime-db@1.46.0:
version "1.46.0"
resolved "https://registry.yarnpkg.com/mime-db/-/mime-db-1.46.0.tgz#6267748a7f799594de3cbc8cde91def349661cee"
integrity sha512-svXaP8UQRZ5K7or+ZmfNhg2xX3yKDMUzqadsSqi4NCH/KomcH75MAMYAGVlvXn4+b/xOPhS3I2uHKRUzvjY7BQ==
mime-types@~2.1.24:
version "2.1.29"
resolved "https://registry.yarnpkg.com/mime-types/-/mime-types-2.1.29.tgz#1d4ab77da64b91f5f72489df29236563754bb1b2"
integrity sha512-Y/jMt/S5sR9OaqteJtslsFZKWOIIqMACsJSiHghlCAyhf7jfVYjKBmLiX8OgpWeW+fjJ2b+Az69aPFPkUOY6xQ==
dependencies:
mime-db "1.46.0"
mime@1.6.0:
version "1.6.0"
resolved "https://registry.yarnpkg.com/mime/-/mime-1.6.0.tgz#32cd9e5c64553bd58d19a568af452acff04981b1"
integrity sha512-x0Vn8spI+wuJ1O6S7gnbaQg8Pxh4NNHb7KSINmEWKiPE4RKOplvijn+NkmYmmRgP68mc70j2EbeTFRsrswaQeg==
ms@2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/ms/-/ms-2.0.0.tgz#5608aeadfc00be6c2901df5f9861788de0d597c8"
integrity sha1-VgiurfwAvmwpAd9fmGF4jeDVl8g=
ms@2.1.1:
version "2.1.1"
resolved "https://registry.yarnpkg.com/ms/-/ms-2.1.1.tgz#30a5864eb3ebb0a66f2ebe6d727af06a09d86e0a"
integrity sha512-tgp+dl5cGk28utYktBsrFqA7HKgrhgPsg6Z/EfhWI4gl1Hwq8B/GmY/0oXZ6nF8hDVesS/FpnYaD/kOWhYQvyg==
nanoid@^3.1.20:
version "3.1.20"
resolved "https://registry.yarnpkg.com/nanoid/-/nanoid-3.1.20.tgz#badc263c6b1dcf14b71efaa85f6ab4c1d6cfc788"
integrity sha512-a1cQNyczgKbLX9jwbS/+d7W8fX/RfgYR7lVWwWOGIPNgK2m0MWvrGF6/m4kk6U3QcFMnZf3RIhL0v2Jgh/0Uxw==
negotiator@0.6.2:
version "0.6.2"
resolved "https://registry.yarnpkg.com/negotiator/-/negotiator-0.6.2.tgz#feacf7ccf525a77ae9634436a64883ffeca346fb"
integrity sha512-hZXc7K2e+PgeI1eDBe/10Ard4ekbfrrqG8Ep+8Jmf4JID2bNg7NvCPOZN+kfF574pFQI7mum2AUqDidoKqcTOw==
object-assign@^4:
version "4.1.1"
resolved "https://registry.yarnpkg.com/object-assign/-/object-assign-4.1.1.tgz#2109adc7965887cfc05cbbd442cac8bfbb360863"
integrity sha1-IQmtx5ZYh8/AXLvUQsrIv7s2CGM=
on-finished@~2.3.0:
version "2.3.0"
resolved "https://registry.yarnpkg.com/on-finished/-/on-finished-2.3.0.tgz#20f1336481b083cd75337992a16971aa2d906947"
integrity sha1-IPEzZIGwg811M3mSoWlxqi2QaUc=
dependencies:
ee-first "1.1.1"
parseurl@~1.3.3:
version "1.3.3"
resolved "https://registry.yarnpkg.com/parseurl/-/parseurl-1.3.3.tgz#9da19e7bee8d12dff0513ed5b76957793bc2e8d4"
integrity sha512-CiyeOxFT/JZyN5m0z9PfXw4SCBJ6Sygz1Dpl0wqjlhDEGGBP1GnsUVEL0p63hoG1fcj3fHynXi9NYO4nWOL+qQ==
path-to-regexp@0.1.7:
version "0.1.7"
resolved "https://registry.yarnpkg.com/path-to-regexp/-/path-to-regexp-0.1.7.tgz#df604178005f522f15eb4490e7247a1bfaa67f8c"
integrity sha1-32BBeABfUi8V60SQ5yR6G/qmf4w=
proxy-addr@~2.0.5:
version "2.0.6"
resolved "https://registry.yarnpkg.com/proxy-addr/-/proxy-addr-2.0.6.tgz#fdc2336505447d3f2f2c638ed272caf614bbb2bf"
integrity sha512-dh/frvCBVmSsDYzw6n926jv974gddhkFPfiN8hPOi30Wax25QZyZEGveluCgliBnqmuM+UJmBErbAUFIoDbjOw==
dependencies:
forwarded "~0.1.2"
ipaddr.js "1.9.1"
qs@6.7.0:
version "6.7.0"
resolved "https://registry.yarnpkg.com/qs/-/qs-6.7.0.tgz#41dc1a015e3d581f1621776be31afb2876a9b1bc"
integrity sha512-VCdBRNFTX1fyE7Nb6FYoURo/SPe62QCaAyzJvUjwRaIsc+NePBEniHlvxFmmX56+HZphIGtV0XeCirBtpDrTyQ==
range-parser@~1.2.1:
version "1.2.1"
resolved "https://registry.yarnpkg.com/range-parser/-/range-parser-1.2.1.tgz#3cf37023d199e1c24d1a55b84800c2f3e6468031"
integrity sha512-Hrgsx+orqoygnmhFbKaHE6c296J+HTAQXoxEF6gNupROmmGJRoyzfG3ccAveqCBrwr/2yxQ5BVd/GTl5agOwSg==
raw-body@2.4.0:
version "2.4.0"
resolved "https://registry.yarnpkg.com/raw-body/-/raw-body-2.4.0.tgz#a1ce6fb9c9bc356ca52e89256ab59059e13d0332"
integrity sha512-4Oz8DUIwdvoa5qMJelxipzi/iJIi40O5cGV1wNYp5hvZP8ZN0T+jiNkL0QepXs+EsQ9XJ8ipEDoiH70ySUJP3Q==
dependencies:
bytes "3.1.0"
http-errors "1.7.2"
iconv-lite "0.4.24"
unpipe "1.0.0"
safe-buffer@5.1.2:
version "5.1.2"
resolved "https://registry.yarnpkg.com/safe-buffer/-/safe-buffer-5.1.2.tgz#991ec69d296e0313747d59bdfd2b745c35f8828d"
integrity sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g==
"safer-buffer@>= 2.1.2 < 3":
version "2.1.2"
resolved "https://registry.yarnpkg.com/safer-buffer/-/safer-buffer-2.1.2.tgz#44fa161b0187b9549dd84bb91802f9bd8385cd6a"
integrity sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg==
send@0.17.1:
version "0.17.1"
resolved "https://registry.yarnpkg.com/send/-/send-0.17.1.tgz#c1d8b059f7900f7466dd4938bdc44e11ddb376c8"
integrity sha512-BsVKsiGcQMFwT8UxypobUKyv7irCNRHk1T0G680vk88yf6LBByGcZJOTJCrTP2xVN6yI+XjPJcNuE3V4fT9sAg==
dependencies:
debug "2.6.9"
depd "~1.1.2"
destroy "~1.0.4"
encodeurl "~1.0.2"
escape-html "~1.0.3"
etag "~1.8.1"
fresh "0.5.2"
http-errors "~1.7.2"
mime "1.6.0"
ms "2.1.1"
on-finished "~2.3.0"
range-parser "~1.2.1"
statuses "~1.5.0"
serve-static@1.14.1:
version "1.14.1"
resolved "https://registry.yarnpkg.com/serve-static/-/serve-static-1.14.1.tgz#666e636dc4f010f7ef29970a88a674320898b2f9"
integrity sha512-JMrvUwE54emCYWlTI+hGrGv5I8dEwmco/00EvkzIIsR7MqrHonbD9pO2MOfFnpFntl7ecpZs+3mW+XbQZu9QCg==
dependencies:
encodeurl "~1.0.2"
escape-html "~1.0.3"
parseurl "~1.3.3"
send "0.17.1"
setprototypeof@1.1.1:
version "1.1.1"
resolved "https://registry.yarnpkg.com/setprototypeof/-/setprototypeof-1.1.1.tgz#7e95acb24aa92f5885e0abef5ba131330d4ae683"
integrity sha512-JvdAWfbXeIGaZ9cILp38HntZSFSo3mWg6xGcJJsd+d4aRMOqauag1C63dJfDw7OaMYwEbHMOxEZ1lqVRYP2OAw==
"statuses@>= 1.5.0 < 2", statuses@~1.5.0:
version "1.5.0"
resolved "https://registry.yarnpkg.com/statuses/-/statuses-1.5.0.tgz#161c7dac177659fd9811f43771fa99381478628c"
integrity sha1-Fhx9rBd2Wf2YEfQ3cfqZOBR4Yow=
toidentifier@1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/toidentifier/-/toidentifier-1.0.0.tgz#7e1be3470f1e77948bc43d94a3c8f4d7752ba553"
integrity sha512-yaOH/Pk/VEhBWWTlhI+qXxDFXlejDGcQipMlyxda9nthulaxLZUNcUqFxokp0vcYnvteJln5FNQDRrxj3YcbVw==
type-is@~1.6.17, type-is@~1.6.18:
version "1.6.18"
resolved "https://registry.yarnpkg.com/type-is/-/type-is-1.6.18.tgz#4e552cd05df09467dcbc4ef739de89f2cf37c131"
integrity sha512-TkRKr9sUTxEH8MdfuCSP7VizJyzRNMjj2J2do2Jr3Kym598JVdEksuzPQCnlFPW4ky9Q+iA+ma9BGm06XQBy8g==
dependencies:
media-typer "0.3.0"
mime-types "~2.1.24"
unpipe@1.0.0, unpipe@~1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/unpipe/-/unpipe-1.0.0.tgz#b2bf4ee8514aae6165b4817829d21b2ef49904ec"
integrity sha1-sr9O6FFKrmFltIF4KdIbLvSZBOw=
utils-merge@1.0.1:
version "1.0.1"
resolved "https://registry.yarnpkg.com/utils-merge/-/utils-merge-1.0.1.tgz#9f95710f50a267947b2ccc124741c1028427e713"
integrity sha1-n5VxD1CiZ5R7LMwSR0HBAoQn5xM=
vary@^1, vary@~1.1.2:
version "1.1.2"
resolved "https://registry.yarnpkg.com/vary/-/vary-1.1.2.tgz#2299f02c6ded30d4a5961b0b9f74524a18f634fc"
integrity sha1-IpnwLG3tMNSllhsLn3RSShj2NPw=

View File

@@ -0,0 +1,20 @@
{
"name": "aws-emulator",
"version": "1.0.0",
"description": "thin layer so that we can use docker lambdas locally",
"scripts": {
"lambdas": "docker-compose up --build",
"emulate": "nodemon ./aws-emulator.js",
"watch": "concurrently \"yarn lambdas\" \"yarn emulate\""
},
"main": "aws-emulator.js",
"dependencies": {
"axios": "^0.21.1",
"cors": "^2.8.5",
"express": "^4.17.1"
},
"devDependencies": {
"concurrently": "^6.0.0",
"nodemon": "^2.0.7"
}
}

View File

@@ -0,0 +1,179 @@
service: cad-lambdas
# app and org for use with dashboard.serverless.com
#app: your-app-name
#org: your-org-name
# plugins:
# - serverless-offline
# You can pin your service to only deploy with a specific Serverless version
# Check out our docs for more details
frameworkVersion: '2'
provider:
name: aws
lambdaHashingVersion: 20201221
ecr:
images:
# this image is built locally and pushed to ECR
openscadimage:
path: ./
file: ./openscad/Dockerfile
cadqueryimage:
path: ./
file: ./cadquery/Dockerfile
apiGateway:
metrics: true
binaryMediaTypes:
# we need to allow binary types to be able to send back images and stls, but it would be better to be more specific,
# i.e. image/png etc., as */* treats everything as binary, including the JSON body that is the input to the lambdas,
# which means we need to decode the input body from base64, but the images break with anything other than */* :(
- '*/*'
# you can overwrite defaults here
# stage: dev
# region: us-east-1
# you can add statements to the Lambda function's IAM Role here
# iamRoleStatements:
# - Effect: "Allow"
# Action:
# - "s3:ListBucket"
# Resource: { "Fn::Join" : ["", ["arn:aws:s3:::", { "Ref" : "ServerlessDeploymentBucket" } ] ] }
# - Effect: "Allow"
# Action:
# - "s3:PutObject"
# Resource:
# Fn::Join:
# - ""
# - - "arn:aws:s3:::"
# - "Ref" : "ServerlessDeploymentBucket"
# - "/*"
# you can define service wide environment variables here
# environment:
# variable1: value1
functions:
# see preflightoptions comment in openscad.js
preflightopenscadpreview:
image:
name: openscadimage
command:
- openscad.preflightOptions
entryPoint:
- '/entrypoint.sh'
events:
- http:
path: openscad/preview
method: options
preflightopenscadstl:
image:
name: openscadimage
command:
- openscad.preflightOptions
entryPoint:
- '/entrypoint.sh'
events:
- http:
path: openscad/stl
method: options
openscadpreview:
image:
name: openscadimage
command:
- openscad.preview
entryPoint:
- '/entrypoint.sh'
events:
- http:
path: openscad/preview
method: post
timeout: 15
openscadstl:
image:
name: openscadimage
command:
- openscad.stl
entryPoint:
- '/entrypoint.sh'
events:
- http:
path: openscad/stl
method: post
timeout: 30
preflightcadquerystl:
image:
name: cadqueryimage
command:
- cadquery.preflightOptions
entryPoint:
- '/entrypoint.sh'
events:
- http:
path: cadquery/stl
method: options
cadquerystl:
image:
name: cadqueryimage
command:
- cadquery.stl
entryPoint:
- '/entrypoint.sh'
events:
- http:
path: cadquery/stl
method: post
timeout: 30
# The following are a few example events you can configure
# NOTE: Please make sure to change your handler code to work with those events
# Check the event documentation for details
# events:
# - httpApi:
# path: /users/create
# method: get
# - websocket: $connect
# - s3: ${env:BUCKET}
# - schedule: rate(10 minutes)
# - sns: greeter-topic
# - stream: arn:aws:dynamodb:region:XXXXXX:table/foo/stream/1970-01-01T00:00:00.000
# - alexaSkill: amzn1.ask.skill.xx-xx-xx-xx
# - alexaSmartHome: amzn1.ask.skill.xx-xx-xx-xx
# - iot:
# sql: "SELECT * FROM 'some_topic'"
# - cloudwatchEvent:
# event:
# source:
# - "aws.ec2"
# detail-type:
# - "EC2 Instance State-change Notification"
# detail:
# state:
# - pending
# - cloudwatchLog: '/aws/lambda/hello'
# - cognitoUserPool:
# pool: MyUserPool
# trigger: PreSignUp
# - alb:
# listenerArn: arn:aws:elasticloadbalancing:us-east-1:XXXXXX:listener/app/my-load-balancer/50dc6c495c0c9188/
# priority: 1
# conditions:
# host: example.com
# path: /hello
# Define function environment variables here
# environment:
# variable2: value2
# you can add CloudFormation resource templates here
#resources:
# Resources:
# NewResource:
# Type: AWS::S3::Bucket
# Properties:
# BucketName: my-new-bucket
# Outputs:
# NewOutput:
# Description: "Description for the output"
# Value: "Some output value"

View File

@@ -1,3 +0,0 @@
import { stl } from 'src/docker/cadquery/cadquery'
export { stl }

View File

@@ -1,40 +0,0 @@
import type { APIGatewayEvent /*, Context*/ } from 'aws-lambda'
import { logger } from 'src/lib/logger'
import { db } from 'src/lib/db'
/**
* The handler function is your code that processes http request events.
* You can use return and throw to send a response or error, respectively.
*
* Important: When deployed, a custom serverless function is an open API endpoint and
* is your responsibility to secure appropriately.
*
* @see {@link https://redwoodjs.com/docs/serverless-functions#security-considerations|Serverless Function Considerations}
* in the RedwoodJS documentation for more information.
*
* @typedef { import('aws-lambda').APIGatewayEvent } APIGatewayEvent
* @typedef { import('aws-lambda').Context } Context
* @param { APIGatewayEvent } event - an object which contains information from the invoker.
* @param { Context } context - contains information about the invocation,
* function, and execution environment.
*/
export const handler = async (event: APIGatewayEvent /*context: Context*/) => {
logger.info('Invoked checkUserName function')
const userName = event.queryStringParameters.username
let isUserNameAvailable = false
try {
const user = await db.user.findUnique({ where: { userName } })
isUserNameAvailable = !user
} catch (error) {
isUserNameAvailable = false
}
return {
statusCode: 200,
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({
isUserNameAvailable,
}),
}
}

View File

@@ -0,0 +1,75 @@
/* for local development
Install and run smee (point at this function)
```
yarn global add smee-client
smee --url https://smee.io/3zgDJiGO8TW7nvf --path /.netlify/functions/event_handler --port 8910
```
*/
import { createHmac } from 'crypto'
import { App } from '@octokit/app'
import type { Endpoints } from '@octokit/types'
import type { PullRequestEvent } from '@octokit/webhooks-types'
const app = new App({
privateKey: process.env.GITHUB_APP_PRIVATE_KEY,
appId: process.env.GITHUB_APP_ID,
webhooks: {
secret: process.env.GITHUB_APP_SECRET,
},
})
const signRequestBody = (secret: string, body: string): string =>
'sha256=' + createHmac('sha256', secret).update(body, 'utf-8').digest('hex')
const writePullRequestComment = async ({
event,
message,
}: {
event: PullRequestEvent
message: string
}): Promise<
Endpoints['POST /repos/{owner}/{repo}/issues/{issue_number}/comments']['response']
> => {
const octokit = await app.getInstallationOctokit(event.installation.id)
return octokit.request(
'POST /repos/{owner}/{repo}/issues/{issue_number}/comments',
{
owner: event.repository.owner.login,
repo: event.repository.name,
issue_number: event.number,
body: message,
}
)
}
export const handler = async (req: {
body: string
headers: {
'x-hub-signature-256': string
'x-github-event': string
}
}) => {
const theirSignature = req.headers['x-hub-signature-256']
const ourSignature = signRequestBody(process.env.GITHUB_APP_SECRET, req.body)
if (theirSignature !== ourSignature) {
return {
statusCode: 401,
body: 'Bad signature',
}
}
const eventType = req.headers['x-github-event']
if (eventType !== 'pull_request') {
return { statusCode: 200 }
}
const event: PullRequestEvent = JSON.parse(req.body)
if (['reopened', 'opened'].includes(event.action)) {
await writePullRequestComment({
event,
message: 'Salutations, what a fine PR you have here.',
})
}
return {
statusCode: 200,
}
}
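
One possible hardening of the signature check above is to compare in constant time; the sketch below shows that idea and is not something the handler currently does — the helper name is illustrative.

import { createHmac, timingSafeEqual } from 'crypto'

// Sketch: constant-time comparison of GitHub's x-hub-signature-256 header
// against our own HMAC (timingSafeEqual requires equal-length buffers).
const isValidSignature = (secret: string, body: string, theirs: string): boolean => {
  const ours =
    'sha256=' + createHmac('sha256', secret).update(body, 'utf-8').digest('hex')
  const a = Buffer.from(ours)
  const b = Buffer.from(theirs || '')
  return a.length === b.length && timingSafeEqual(a, b)
}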

View File

@@ -0,0 +1,23 @@
import {
createGraphQLHandler,
makeMergedSchema,
makeServices,
} from '@redwoodjs/api'
import schemas from 'src/graphql/**/*.{js,ts}'
import services from 'src/services/**/*.{js,ts}'
import { getCurrentUser } from 'src/lib/auth'
import { db } from 'src/lib/db'
export const handler = createGraphQLHandler({
getCurrentUser,
schema: makeMergedSchema({
schemas,
services: makeServices({ services }),
}),
onException: () => {
// Disconnect from your database with an unhandled exception.
db.$disconnect()
},
})

View File

@@ -1,28 +0,0 @@
import { createGraphQLHandler } from '@redwoodjs/graphql-server'
import { createSentryApolloPlugin } from 'src/lib/sentry'
import { logger } from 'src/lib/logger'
import directives from 'src/directives/**/*.{js,ts}'
import sdls from 'src/graphql/**/*.sdl.{js,ts}'
import services from 'src/services/**/*.{js,ts}'
import { getCurrentUser } from 'src/lib/auth'
import { db } from 'src/lib/db'
export const handler = createGraphQLHandler({
loggerConfig: { logger, options: {} },
getCurrentUser,
directives,
sdls,
services,
plugins: [createSentryApolloPlugin()],
cors: {
origin: '*',
credentials: true,
},
onException: () => {
// Disconnect from your database with an unhandled exception.
db.$disconnect()
},
})

View File

@@ -1,11 +1,8 @@
import { createUserInsecure } from 'src/services/users/users'
import { createUserInsecure } from 'src/services/users/users.js'
import { db } from 'src/lib/db'
import { sentryWrapper } from 'src/lib/sentry'
import { enforceAlphaNumeric, generateUniqueString } from 'src/services/helpers'
import 'graphql-tag'
import { sendMail } from 'src/lib/sendmail'
const unWrappedHandler = async (req, _context) => {
export const handler = async (req, _context) => {
const body = JSON.parse(req.body)
console.log(body)
console.log(_context)
@@ -57,7 +54,7 @@ const unWrappedHandler = async (req, _context) => {
const user = body.user
const email = user.email
const roles = []
let roles = []
if (eventType === 'signup') {
roles.push('user')
@@ -67,53 +64,13 @@ const unWrappedHandler = async (req, _context) => {
})
const userNameSeed = enforceAlphaNumeric(user?.user_metadata?.userName)
const userName = await generateUniqueString(userNameSeed, isUniqueCallback) // TODO maybe come up with a better default userName?
const name = user?.user_metadata?.full_name
const input = {
email,
userName,
name,
name: user?.user_metadata?.full_name,
id: user.id,
}
await createUserInsecure({ input })
const kurtNotification = sendMail({
to: 'k.hutten@protonmail.ch',
from: {
address: 'news@mail.cadhub.xyz',
name: 'CadHub',
},
subject: `New Cadhub User`,
text: JSON.stringify(input, null, 2),
})
const welcomeMsg = sendMail({
to: email,
from: {
address: 'news@mail.cadhub.xyz',
name: 'CadHub',
},
subject: `${name} - Some things you should know about CadHub`,
text: `Hi, My name's Kurt.
I started CadHub because I wanted a community hub for people who like CodeCAD as much as I do. You should know that the development of CadHub is very much a community effort as well, and if you want to get involved the Discord is the best place to start https://discord.gg/SD7zFRNjGH.
Long term I hope that CadHub will help push CodeCAD as a paradigm forward, as there are clear benefits: CI/CD for parts, a Git-based workflow, and CodeCAD parts are normally much more robust to changes in parametric variables, because the author can add logic to accommodate big changes, whereas GUI CAD usually relies on black-box heuristics and is more brittle. Sorry, I'm getting into the weeds; if you want to read more on the paradigm see our blog https://learn.cadhub.xyz/.
One very easy way to help out is to give the repo a star (https://github.com/Irev-Dev/cadhub), or simply add any OpenSCAD or CadQuery models you have to the website; building out the library of parts is very important at the moment.
Hit me up anytime for questions or concerns.
Cheers,
Kurt.
k.hutten@protonmail.ch
https://twitter.com/IrevDev
irevdev#1888 - discord
`,
})
try {
await Promise.all([kurtNotification, welcomeMsg])
} catch (e) {
console.log('Problem sending emails', e)
}
return {
statusCode: 200,
@@ -125,5 +82,3 @@ irevdev#1888 - discord
}
}
}
export const handler = sentryWrapper(unWrappedHandler)

View File

@@ -1,3 +0,0 @@
import { stl, preview } from 'src/docker/openscad/openscad'
export { stl, preview }

View File

@@ -1,41 +0,0 @@
export const schema = gql`
type ProjectReaction {
id: String!
emote: String!
user: User!
userId: String!
project: Project!
projectId: String!
createdAt: DateTime!
updatedAt: DateTime!
}
type Query {
projectReactions: [ProjectReaction!]! @skipAuth
projectReaction(id: String!): ProjectReaction @skipAuth
projectReactionsByProjectId(projectId: String!): [ProjectReaction!]!
@skipAuth
}
input ToggleProjectReactionInput {
emote: String!
userId: String!
projectId: String!
}
input UpdateProjectReactionInput {
emote: String
userId: String
projectId: String
}
type Mutation {
toggleProjectReaction(input: ToggleProjectReactionInput!): ProjectReaction!
@requireAuth
updateProjectReaction(
id: String!
input: UpdateProjectReactionInput!
): ProjectReaction! @requireAuth
deleteProjectReaction(id: String!): ProjectReaction! @requireAuth
}
`

View File

@@ -4,33 +4,32 @@ export const schema = gql`
text: String!
user: User!
userId: String!
project: Project!
projectId: String!
part: Part!
partId: String!
createdAt: DateTime!
updatedAt: DateTime!
}
type Query {
comments: [Comment!]! @skipAuth
comment(id: String!): Comment @skipAuth
comments: [Comment!]!
comment(id: String!): Comment
}
input CreateCommentInput {
text: String!
userId: String!
projectId: String!
partId: String!
}
input UpdateCommentInput {
text: String
userId: String
projectId: String
partId: String
}
type Mutation {
createComment(input: CreateCommentInput!): Comment! @requireAuth
createComment(input: CreateCommentInput!): Comment!
updateComment(id: String!, input: UpdateCommentInput!): Comment!
@requireAuth
deleteComment(id: String!): Comment! @requireAuth
deleteComment(id: String!): Comment!
}
`

View File

@@ -1,20 +0,0 @@
export const schema = gql`
type Envelope {
from: String
to: [String!]!
}
type EmailResponse {
accepted: [String!]!
rejected: [String!]!
}
input Email {
subject: String!
body: String!
}
type Mutation {
sendAllUsersEmail(input: Email!): EmailResponse! @requireAuth
}
`

View File

@@ -0,0 +1,39 @@
export const schema = gql`
type PartReaction {
id: String!
emote: String!
user: User!
userId: String!
part: Part!
partId: String!
createdAt: DateTime!
updatedAt: DateTime!
}
type Query {
partReactions: [PartReaction!]!
partReaction(id: String!): PartReaction
partReactionsByPartId(partId: String!): [PartReaction!]!
}
input TogglePartReactionInput {
emote: String!
userId: String!
partId: String!
}
input UpdatePartReactionInput {
emote: String
userId: String
partId: String
}
type Mutation {
togglePartReaction(input: TogglePartReactionInput!): PartReaction!
updatePartReaction(
id: String!
input: UpdatePartReactionInput!
): PartReaction!
deletePartReaction(id: String!): PartReaction!
}
`

View File

@@ -0,0 +1,45 @@
export const schema = gql`
type Part {
id: String!
title: String!
description: String
code: String
mainImage: String
createdAt: DateTime!
updatedAt: DateTime!
deleted: Boolean!
user: User!
userId: String!
Comment: [Comment]!
Reaction(userId: String): [PartReaction]!
}
type Query {
parts(userName: String): [Part!]!
part(id: String!): Part
partByUserAndTitle(userName: String!, partTitle: String!): Part
}
input CreatePartInput {
title: String!
description: String
code: String
mainImage: String
userId: String!
}
input UpdatePartInput {
title: String
description: String
code: String
mainImage: String
userId: String
}
type Mutation {
createPart(input: CreatePartInput!): Part!
forkPart(input: CreatePartInput!): Part!
updatePart(id: String!, input: UpdatePartInput!): Part!
deletePart(id: String!): Part!
}
`

View File

@@ -1,70 +0,0 @@
export const schema = gql`
type Project {
id: String!
title: String!
description: String
code: String
mainImage: String
createdAt: DateTime!
updatedAt: DateTime!
user: User!
userId: String!
deleted: Boolean!
cadPackage: CadPackage!
socialCard: SocialCard
Comment: [Comment]!
Reaction(userId: String): [ProjectReaction]!
forkedFromId: String
forkedFrom: Project
childForks: [Project]!
}
enum CadPackage {
openscad
cadquery
jscad
}
type Query {
projects(userName: String): [Project!]! @skipAuth
project(id: String!): Project @skipAuth
projectByUserAndTitle(userName: String!, projectTitle: String!): Project
@skipAuth
}
input CreateProjectInput {
title: String
description: String
code: String
mainImage: String
userId: String!
cadPackage: CadPackage!
}
input ForkProjectInput {
userId: String!
forkedFromId: String
code: String
}
input UpdateProjectInput {
title: String
description: String
code: String
mainImage: String
userId: String
}
type Mutation {
createProject(input: CreateProjectInput!): Project! @requireAuth
forkProject(input: ForkProjectInput!): Project! @requireAuth
updateProject(id: String!, input: UpdateProjectInput!): Project!
@requireAuth
updateProjectImages(
id: String!
mainImage64: String
socialCard64: String
): Project! @requireAuth
deleteProject(id: String!): Project! @requireAuth
}
`

View File

@@ -1,16 +0,0 @@
export const schema = gql`
type SocialCard {
id: String!
projectId: String!
project: Project!
createdAt: DateTime!
updatedAt: DateTime!
url: String
outOfDate: Boolean!
}
type Query {
socialCards: [SocialCard!]! @skipAuth
socialCard(id: String!): SocialCard @skipAuth
}
`

View File

@@ -10,8 +10,8 @@ export const schema = gql`
}
type Query {
subjectAccessRequests: [SubjectAccessRequest!]! @requireAuth
subjectAccessRequest(id: String!): SubjectAccessRequest @requireAuth
subjectAccessRequests: [SubjectAccessRequest!]!
subjectAccessRequest(id: String!): SubjectAccessRequest
}
input CreateSubjectAccessRequestInput {
@@ -29,11 +29,11 @@ export const schema = gql`
type Mutation {
createSubjectAccessRequest(
input: CreateSubjectAccessRequestInput!
): SubjectAccessRequest! @requireAuth
): SubjectAccessRequest!
updateSubjectAccessRequest(
id: String!
input: UpdateSubjectAccessRequestInput!
): SubjectAccessRequest! @requireAuth
deleteSubjectAccessRequest(id: String!): SubjectAccessRequest! @requireAuth
): SubjectAccessRequest!
deleteSubjectAccessRequest(id: String!): SubjectAccessRequest!
}
`

View File

@@ -8,17 +8,17 @@ export const schema = gql`
updatedAt: DateTime!
image: String
bio: String
Projects: [Project]!
Project(projectTitle: String): Project
Reaction: [ProjectReaction]!
Parts: [Part]!
Part(partTitle: String): Part
Reaction: [PartReaction]!
Comment: [Comment]!
SubjectAccessRequest: [SubjectAccessRequest]!
}
type Query {
users: [User!]! @requireAuth
user(id: String!): User @skipAuth
userName(userName: String!): User @skipAuth
users: [User!]!
user(id: String!): User
userName(userName: String!): User
}
input CreateUserInput {
@@ -38,10 +38,9 @@ export const schema = gql`
}
type Mutation {
createUser(input: CreateUserInput!): User! @requireAuth
updateUser(id: String!, input: UpdateUserInput!): User! @requireAuth
createUser(input: CreateUserInput!): User!
updateUser(id: String!, input: UpdateUserInput!): User!
updateUserByUserName(userName: String!, input: UpdateUserInput!): User!
@requireAuth
deleteUser(id: String!): User! @requireAuth
deleteUser(id: String!): User!
}
`

View File

@@ -1,5 +1,61 @@
import { AuthenticationError, ForbiddenError } from '@redwoodjs/graphql-server'
import { parseJWT } from '@redwoodjs/api'
// Define what you want `currentUser` to return throughout your app. For example,
// to return a real user from your database, you could do something like:
//
// export const getCurrentUser = async ({ email }) => {
// return await db.user.findUnique({ where: { email } })
// }
//
// If you want to enforce role-based access ...
//
// You'll need to set the currentUser's roles attribute to the
// collection of roles as defined by your app.
//
// This allows requireAuth() on the api side and hasRole() in the useAuth() hook on the web side
// to check if the user is assigned a given role or not.
//
// How you set the currentUser's roles depends on your auth provider and its implementation.
//
// For example, your decoded JWT may store `roles` in a namespaced `app_metadata`:
//
// {
// 'https://example.com/app_metadata': { authorization: { roles: ['admin'] } },
// 'https://example.com/user_metadata': {},
// iss: 'https://app.us.auth0.com/',
// sub: 'email|1234',
// aud: [
// 'https://example.com',
// 'https://app.us.auth0.com/userinfo'
// ],
// iat: 1596481520,
// exp: 1596567920,
// azp: '1l0w6JXXXXL880T',
// scope: 'openid profile email'
// }
//
// The parseJWT utility will extract the roles from the decoded token.
//
// The app_metadata claim may or may not be namespaced based on the auth provider.
// Note: Auth0 requires namespacing custom JWT claims
//
// Some providers, such as Auth0, will set roles as an authorization
// attribute in app_metadata (namespaced or not):
//
// 'app_metadata': { authorization: { roles: ['publisher'] } }
// 'https://example.com/app_metadata': { authorization: { roles: ['publisher'] } }
//
// Other providers may include roles simply within app_metadata:
//
// 'app_metadata': { roles: ['author'] }
// 'https://example.com/app_metadata': { roles: ['author'] }
//
// And yet others may define roles as a custom claim at the root of the decoded token:
//
// roles: ['admin']
//
// The function `getCurrentUser` should return the user information
// together with a collection of roles to check for role assignment:
import { AuthenticationError, ForbiddenError, parseJWT } from '@redwoodjs/api'
/**
* Use requireAuth in your services to check that a user is logged in,
@@ -41,24 +97,8 @@ import { parseJWT } from '@redwoodjs/api'
* }
* }
*/
export const getCurrentUser = async (
decoded,
{ _token, _type },
{ _event, _context }
) => {
if (!decoded) {
// if no decoded, then never set currentUser
return null
}
const { roles } = parseJWT({ decoded }) // extract and check roles separately
if (roles) {
return { ...decoded, roles }
}
return { ...decoded } // only return when certain you have
// the currentUser properties
export const getCurrentUser = async (decoded, { _token, _type }) => {
return { ...decoded, roles: parseJWT({ decoded }).roles }
}
/**
@@ -81,8 +121,7 @@ export const getCurrentUser = async (
* requireAuth({ role: ['editor', 'author'] })
* requireAuth({ role: ['publisher'] })
*/
export const requireAuth = ({ role }: { role?: string | string[] } = {}) => {
console.log(context.currentUser)
export const requireAuth = ({ role } = {}) => {
if (!context.currentUser) {
throw new AuthenticationError("You don't have permission to do that.")
}
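
A quick illustration of how the simplified getCurrentUser and requireAuth above fit together — the decoded token below is made up, but it follows the app_metadata pattern described in the comment block.

// Illustrative decoded Netlify Identity token carrying roles in app_metadata.
const decoded = {
  sub: 'user-id-123',
  email: 'someone@example.com',
  app_metadata: { roles: ['user'] },
}
// parseJWT({ decoded }).roles     -> ['user'] (per the comment block above)
// getCurrentUser(decoded, ...)    -> { ...decoded, roles: ['user'] }
// requireAuth({ role: 'admin' })  -> throws for this user, since 'admin' is missing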

View File

@@ -1,45 +0,0 @@
import {
v2 as cloudinary,
UploadApiResponse,
UpdateApiOptions,
} from 'cloudinary'
cloudinary.config({
cloud_name: 'irevdev',
api_key: process.env.CLOUDINARY_API_KEY,
api_secret: process.env.CLOUDINARY_API_SECRET,
})
interface UploadImageArgs {
image64: string
uploadPreset?: string
publicId?: string
invalidate: boolean
}
export const uploadImage = async ({
image64,
uploadPreset = 'CadHub_project_images',
publicId,
invalidate = true,
}: UploadImageArgs): Promise<UploadApiResponse> => {
const options: UpdateApiOptions = { upload_preset: uploadPreset, invalidate }
if (publicId) {
options.public_id = publicId
}
return new Promise((resolve, reject) => {
cloudinary.uploader.upload(image64, options, (error, result) => {
if (error) {
reject(error)
return
}
resolve(result)
})
})
}
export const makeSocialPublicIdServer = (
userName: string,
projectTitle: string
): string => `u-${userName}-slash-p-${projectTitle}`

44
app/api/src/lib/owner.js Normal file
View File

@@ -0,0 +1,44 @@
import { AuthenticationError, ForbiddenError } from '@redwoodjs/api'
import { db } from 'src/lib/db'
export const requireOwnership = async ({ userId, userName, partId } = {}) => {
// IMPORTANT, don't forget to await this function, as it will only block
// unwanted db actions if it has time to look up resources in the db.
if (!context.currentUser) {
throw new AuthenticationError("You don't have permission to do that.")
}
if (!userId && !userName && !partId) {
throw new ForbiddenError("You don't have access to do that.")
}
if (context.currentUser.roles?.includes('admin')) {
return
}
const netlifyUserId = context.currentUser?.sub
if (userId && userId !== netlifyUserId) {
throw new ForbiddenError("You don't own this resource.")
}
if (userName) {
const user = await db.user.findUnique({
where: { userName },
})
if (!user || user.id !== netlifyUserId) {
throw new ForbiddenError("You don't own this resource.")
}
}
if (partId) {
const user = await db.part
.findUnique({
where: { id: partId },
})
.user()
if (!user || user.id !== netlifyUserId) {
throw new ForbiddenError("You don't own this resource.")
}
}
}
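
A hedged sketch of how requireOwnership is meant to be called from a service — the updatePart mutation exists in the schema above, but this service body is illustrative; the key point is the await flagged in the IMPORTANT comment.

// Illustrative service: await the ownership check so the db lookup completes
// before any write happens.
export const updatePart = async ({ id, input }) => {
  await requireOwnership({ partId: id })
  return db.part.update({ data: input, where: { id } })
}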

View File

@@ -1,94 +0,0 @@
import { AuthenticationError, ForbiddenError } from '@redwoodjs/graphql-server'
import type { Project } from '@prisma/client'
import { db } from 'src/lib/db'
export const requireOwnership = async ({
userId,
userName,
projectId,
sub,
}: {
userId?: string
userName?: string
projectId?: string
sub?: string
} = {}) => {
// IMPORTANT, don't forget to await this function, as it will only block
// unwanted db actions if it has time to look up resources in the db.
if (!(context?.currentUser || sub)) {
throw new AuthenticationError("You don't have permission to do that.")
}
if (!userId && !userName && !projectId) {
throw new ForbiddenError("You don't have access to do that.")
}
if (context.currentUser.roles?.includes('admin')) {
if (context.currentUser?.sub === '5cea3906-1e8e-4673-8f0d-89e6a963c096') {
throw new ForbiddenError("That's a local admin ONLY.")
}
return
}
const netlifyUserId = context?.currentUser?.sub || sub
if (userId && userId !== netlifyUserId) {
throw new ForbiddenError("You don't own this resource.")
}
if (userName) {
const user = await db.user.findUnique({
where: { userName },
})
if (!user || user.id !== netlifyUserId) {
throw new ForbiddenError("You don't own this resource.")
}
}
if (projectId) {
const user = await db.project
.findUnique({
where: { id: projectId },
})
.user()
if (!user || user.id !== netlifyUserId) {
throw new ForbiddenError("You don't own this resource.")
}
}
}
export const requireProjectOwnership = async ({
projectId,
}: {
userId?: string
userName?: string
projectId?: string
sub?: string
} = {}): Promise<Project> => {
// IMPORTANT, don't forget to await this function, as it will only block
// unwanted db actions if it has time to look up resources in the db.
if (!context?.currentUser) {
throw new AuthenticationError("You don't have permission to do that.")
}
if (!projectId) {
throw new ForbiddenError("You don't have access to do that.")
}
const netlifyUserId = context?.currentUser?.sub
if (projectId || context.currentUser.roles?.includes('admin')) {
if (context.currentUser?.sub === '5cea3906-1e8e-4673-8f0d-89e6a963c096') {
throw new ForbiddenError("That's a local admin ONLY.")
}
const project = await db.project.findUnique({
where: { id: projectId },
})
const hasPermission =
(project && project?.userId === netlifyUserId) ||
context.currentUser.roles?.includes('admin')
if (!hasPermission) {
throw new ForbiddenError("You don't own this resource.")
}
return project
}
}

View File

@@ -1,63 +0,0 @@
import nodemailer, { SendMailOptions } from 'nodemailer'
export interface SendMailArgs {
to: string
from: SendMailOptions['from']
subject: string
text: string
}
interface SuccessResult {
accepted: string[]
rejected: string[]
envelopeTime: number
messageTime: number
messageSize: number
response: string
envelope: {
from: string | false
to: string[]
}
messageId: string
}
export function sendMail({
to,
from,
subject,
text,
}: SendMailArgs): Promise<SuccessResult> {
const transporter = nodemailer.createTransport({
host: 'smtp.mailgun.org',
port: 587,
secure: false,
tls: {
ciphers: 'SSLv3',
},
auth: {
user: 'postmaster@mail.cadhub.xyz',
pass: process.env.EMAIL_PASSWORD,
},
})
console.log({ to, from, subject, text })
const emailPromise = new Promise((resolve, reject) => {
transporter.sendMail(
{
from,
to,
subject,
text,
},
(error, info) => {
if (error) {
reject(error)
} else {
resolve(info)
}
}
)
}) as any as Promise<SuccessResult>
return emailPromise
}

View File

@@ -1,105 +0,0 @@
import { Config, ApolloError } from '@redwoodjs/graphql-server'
import * as Sentry from '@sentry/node'
let sentryInitialized = false
if (process.env.SENTRY_DSN && !sentryInitialized) {
Sentry.init({
dsn: process.env.SENTRY_DSN,
environment: process.env.CONTEXT,
release: process.env.COMMIT_REF,
})
sentryInitialized = true
}
async function reportError(error) {
if (!sentryInitialized) return
// If you do have authentication set up, we can add
// some user data to help debug issues
// if (context.currentUser) {
// Sentry.configureScope((scope) => {
// scope.setUser({
// id: context?.currentUser?.id,
// email: context?.currentUser?.email,
// })
// })
// }
if (typeof error === 'string') {
Sentry.captureMessage(error)
} else {
Sentry.captureException(error)
}
await Sentry.flush()
}
export const sentryWrapper = (handler) => async (event, lambdaContext) => {
lambdaContext.callbackWaitsForEmptyEventLoop = false
try {
return await new Promise((resolve, reject) => {
const callback = (err, result) => {
if (err) {
reject(err)
} else {
resolve(result)
}
}
const resp = handler(event, lambdaContext, callback)
if (resp?.then) {
resp.then(resolve, reject)
}
})
} catch (e) {
// This catches both sync errors & promise
// rejections, because we 'await' on the handler
await reportError(e)
throw e
}
}
export const createSentryApolloPlugin: Config['plugins'][number] = () => ({
requestDidStart: () => {
return {
didEncounterErrors(ctx) {
// If we couldn't parse the operation, don't
// do anything here
if (!ctx.operation) {
return
}
for (const err of ctx.errors) {
// Only report internal server errors,
// all errors extending ApolloError should be user-facing
if (err instanceof ApolloError) {
continue
}
// Add scoped report details and send to Sentry
Sentry.withScope((scope) => {
// Annotate whether failing operation was query/mutation/subscription
scope.setTag('kind', ctx.operation.operation)
// Log query and variables as extras (make sure to strip out sensitive data!)
scope.setExtra('query', ctx.request.query)
scope.setExtra('variables', ctx.request.variables)
if (err.path) {
// We can also add the path as breadcrumb
scope.addBreadcrumb({
category: 'query-path',
message: err.path.join(' > '),
level: Sentry.Severity.Debug,
})
}
const transactionId =
ctx.request.http.headers.get('x-transaction-id')
if (transactionId) {
scope.setTransaction(transactionId)
}
Sentry.captureException(err)
})
}
},
}
},
})
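A sketch of how the two exports above are meant to be wired up. sentryWrapper wraps a standard Lambda handler so that sync throws and promise rejections get reported before being re-thrown, while createSentryApolloPlugin would be added to the Apollo plugin list of the graphql function. The handler body and the import path below are assumptions; only the wrapper signature shown above is relied on:

```ts
import { sentryWrapper } from 'src/lib/sentry' // assumed module path

// Hypothetical Lambda: any error thrown inside (sync or async) is captured
// by the wrapper, flushed to Sentry, and then re-thrown to the caller.
export const handler = sentryWrapper(async (event) => {
  const body = JSON.parse(event.body || '{}')
  return { statusCode: 200, body: JSON.stringify({ received: body }) }
})
```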

View File

@@ -33,6 +33,6 @@ export const deleteComment = ({ id }) => {
export const Comment = {
user: (_obj, { root }) =>
db.comment.findUnique({ where: { id: root.id } }).user(),
project: (_obj, { root }) =>
db.comment.findUnique({ where: { id: root.id } }).project(),
part: (_obj, { root }) =>
db.comment.findUnique({ where: { id: root.id } }).part(),
}

View File

@@ -0,0 +1,9 @@
/*
import { comments } from './comments'
*/
describe('comments', () => {
it('returns true', () => {
expect(true).toBe(true)
})
})

View File

@@ -1,45 +0,0 @@
import { requireAuth } from 'src/lib/auth'
import { sendMail } from 'src/lib/sendmail'
import type { SendMailArgs } from 'src/lib/sendmail'
import { users } from 'src/services/users/users'
export const sendAllUsersEmail = async ({ input: { body, subject } }) => {
requireAuth({ role: 'admin' })
const from = {
address: 'news@mail.cadhub.xyz',
name: 'CadHub',
}
const emails: SendMailArgs[] = (await users()).map(({ email }) => ({
to: email,
from,
subject,
text: body,
}))
const emailPromises = emails.map((email) => sendMail(email))
const accepted = []
const rejected = []
const result = await Promise.allSettled(emailPromises)
result.forEach((result) => {
if (result.status === 'fulfilled') {
accepted.push(result.value.accepted[0])
} else {
rejected.push(result.reason)
}
})
await sendMail({
to: 'k.hutten@protonmail.ch',
from,
subject: `All users email report`,
text: JSON.stringify(
{
accepted,
rejected,
originalEmailList: emails,
},
null,
2
),
})
return { accepted, rejected }
}

View File

@@ -1,6 +1,4 @@
import { v2 as cloudinary } from 'cloudinary'
import humanId from 'human-id'
cloudinary.config({
cloud_name: 'irevdev',
api_key: process.env.CLOUDINARY_API_KEY,
@@ -22,7 +20,7 @@ export const foreignKeyReplacement = (input) => {
}
export const enforceAlphaNumeric = (string) =>
(string || '').replace(/([^a-zA-Z\d_:])/g, '-')
string.replace(/([^a-zA-Z\d_:])/g, '-')
export const generateUniqueString = async (
seed,
@@ -38,26 +36,6 @@ export const generateUniqueString = async (
return generateUniqueString(newSeed, isUniqueCallback, count)
}
export const generateUniqueStringWithoutSeed = async (
isUniqueCallback: (seed: string) => Promise<boolean>,
count = 0
) => {
const seed = humanId({
separator: '-',
capitalize: false,
})
const isUnique = !(await isUniqueCallback(seed))
if (isUnique) {
return seed
}
count += 1
if (count > 100) {
console.log('trouble finding unique')
return `very-unique-${seed}`.slice(0, 10)
}
return generateUniqueStringWithoutSeed(isUniqueCallback, count)
}
export const destroyImage = ({ publicId }) =>
new Promise((resolve, reject) => {
cloudinary.uploader.destroy(publicId, (error, result) => {
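As a quick illustration of the helper kept by this hunk: enforceAlphaNumeric replaces every character outside [a-zA-Z0-9_:] with a dash (the input strings are made up):

```ts
import { enforceAlphaNumeric } from 'src/services/helpers'

enforceAlphaNumeric('My Fancy Part!') // => 'My-Fancy-Part-'
enforceAlphaNumeric('hinge v2.1')     // => 'hinge-v2-1'
```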

View File

@@ -1,28 +1,28 @@
import { UserInputError } from '@redwoodjs/graphql-server'
import { UserInputError } from '@redwoodjs/api'
import { requireAuth } from 'src/lib/auth'
import { requireOwnership } from 'src/lib/owner'
import { db } from 'src/lib/db'
import { foreignKeyReplacement } from 'src/services/helpers'
export const projectReactions = () => {
return db.projectReaction.findMany()
export const partReactions = () => {
return db.partReaction.findMany()
}
export const projectReaction = ({ id }) => {
return db.projectReaction.findUnique({
export const partReaction = ({ id }) => {
return db.partReaction.findUnique({
where: { id },
})
}
export const projectReactionsByProjectId = ({ projectId }) => {
return db.projectReaction.findMany({
where: { projectId },
export const partReactionsByPartId = ({ partId }) => {
return db.partReaction.findMany({
where: { partId: partId },
})
}
export const toggleProjectReaction = async ({ input }) => {
// if the write fails the emote_userId_projectId @@unique constraint, delete it instead
export const togglePartReaction = async ({ input }) => {
// if the write fails the emote_userId_partId @@unique constraint, delete it instead
requireAuth()
await requireOwnership({ userId: input?.userId })
const legalReactions = ['❤️', '👍', '😄', '🙌'] // TODO figure out a way of sharing code between FE and BE, so this is consistent with web/src/components/EmojiReaction/EmojiReaction.js
@@ -36,33 +36,33 @@ export const toggleProjectReaction = async ({ input }) => {
let dbPromise
const inputClone = { ...input } // TODO foreignKeyReplacement mutates input, which I should fix but am lazy right now
try {
dbPromise = await db.projectReaction.create({
dbPromise = await db.partReaction.create({
data: foreignKeyReplacement(input),
})
} catch (e) {
dbPromise = db.projectReaction.delete({
where: { emote_userId_projectId: inputClone },
dbPromise = db.partReaction.delete({
where: { emote_userId_partId: inputClone },
})
}
return dbPromise
}
export const updateProjectReaction = ({ id, input }) => {
return db.projectReaction.update({
export const updatePartReaction = ({ id, input }) => {
return db.partReaction.update({
data: foreignKeyReplacement(input),
where: { id },
})
}
export const deleteProjectReaction = ({ id }) => {
return db.projectReaction.delete({
export const deletePartReaction = ({ id }) => {
return db.partReaction.delete({
where: { id },
})
}
export const ProjectReaction = {
export const PartReaction = {
user: (_obj, { root }) =>
db.projectReaction.findUnique({ where: { id: root.id } }).user(),
project: (_obj, { root }) =>
db.projectReaction.findUnique({ where: { id: root.id } }).project(),
db.partReaction.findUnique({ where: { id: root.id } }).user(),
part: (_obj, { root }) =>
db.partReaction.findUnique({ where: { id: root.id } }).part(),
}

View File

@@ -0,0 +1,9 @@
/*
import { partReactions } from './partReactions'
*/
describe('partReactions', () => {
it('returns true', () => {
expect(true).toBe(true)
})
})

View File

@@ -0,0 +1,113 @@
import { db } from 'src/lib/db'
import {
foreignKeyReplacement,
enforceAlphaNumeric,
generateUniqueString,
destroyImage,
} from 'src/services/helpers'
import { requireAuth } from 'src/lib/auth'
import { requireOwnership } from 'src/lib/owner'
export const parts = ({ userName }) => {
if (!userName) {
return db.part.findMany({ where: { deleted: false } })
}
return db.part.findMany({
where: {
deleted: false,
user: {
userName,
},
},
})
}
export const part = ({ id }) => {
return db.part.findUnique({
where: { id },
})
}
export const partByUserAndTitle = async ({ userName, partTitle }) => {
const user = await db.user.findUnique({
where: {
userName,
},
})
return db.part.findUnique({
where: {
title_userId: {
title: partTitle,
userId: user.id,
},
},
})
}
export const createPart = async ({ input }) => {
requireAuth()
return db.part.create({
data: foreignKeyReplacement(input),
})
}
export const forkPart = async ({ input }) => {
// The only difference between createPart and forkPart is that forkPart will generate a unique title
// (for the user) if there is a conflict
const isUniqueCallback = async (seed) =>
db.part.findUnique({
where: {
title_userId: {
title: seed,
userId: input.userId,
},
},
})
const title = await generateUniqueString(input.title, isUniqueCallback)
// TODO change the description to `forked from userName/partName ${rest of description}`
return db.part.create({
data: foreignKeyReplacement({ ...input, title }),
})
}
export const updatePart = async ({ id, input }) => {
requireAuth()
await requireOwnership({ partId: id })
if (input.title) {
input.title = enforceAlphaNumeric(input.title)
}
const originalPart = await db.part.findUnique({ where: { id } })
const imageToDestroy =
originalPart.mainImage !== input.mainImage && originalPart.mainImage
const update = await db.part.update({
data: foreignKeyReplacement(input),
where: { id },
})
if (imageToDestroy) {
console.log(`image destroyed, publicId: ${imageToDestroy}, partId: ${id}`)
// destroy after the db has been updated
destroyImage({ publicId: imageToDestroy })
}
return update
}
export const deletePart = async ({ id }) => {
requireAuth()
await requireOwnership({ partId: id })
return db.part.update({
data: {
deleted: true,
},
where: { id },
})
}
export const Part = {
user: (_obj, { root }) =>
db.part.findUnique({ where: { id: root.id } }).user(),
Comment: (_obj, { root }) =>
db.part.findUnique({ where: { id: root.id } }).Comment(),
Reaction: (_obj, { root }) =>
db.part
.findUnique({ where: { id: root.id } })
.Reaction({ where: { userId: _obj.userId } }),
}

View File

@@ -0,0 +1,9 @@
/*
import { parts } from './parts'
*/
describe('parts', () => {
it('returns true', () => {
expect(true).toBe(true)
})
})

View File

@@ -1,294 +0,0 @@
import { ResolverArgs } from '@redwoodjs/graphql-server'
import type { Prisma, Project as ProjectType } from '@prisma/client'
import { uploadImage, makeSocialPublicIdServer } from 'src/lib/cloudinary'
import { db } from 'src/lib/db'
import {
foreignKeyReplacement,
enforceAlphaNumeric,
generateUniqueString,
generateUniqueStringWithoutSeed,
destroyImage,
} from 'src/services/helpers'
import { requireAuth } from 'src/lib/auth'
import { requireOwnership, requireProjectOwnership } from 'src/lib/owner'
export const projects = ({ userName }) => {
if (!userName) {
return db.project.findMany({ where: { deleted: false } })
}
return db.project.findMany({
where: {
deleted: false,
user: {
userName,
},
},
})
}
export const project = ({ id }: Prisma.ProjectWhereUniqueInput) => {
return db.project.findUnique({
where: { id },
})
}
export const projectByUserAndTitle = async ({ userName, projectTitle }) => {
const user = await db.user.findUnique({
where: {
userName,
},
})
return db.project.findUnique({
where: {
title_userId: {
title: projectTitle,
userId: user.id,
},
},
})
}
const isUniqueProjectTitle =
(userId: string) =>
async (seed: string): Promise<boolean> =>
!!(await db.project.findUnique({
where: {
title_userId: {
title: seed,
userId,
},
},
}))
interface CreateProjectArgs {
input: Prisma.ProjectCreateArgs['data']
}
export const createProject = async ({ input }: CreateProjectArgs) => {
requireAuth()
console.log(input.userId)
const isUniqueCallback = isUniqueProjectTitle(input.userId)
let title = input.title
if (!title) {
title = await generateUniqueStringWithoutSeed(isUniqueCallback)
}
return db.project.create({
data: foreignKeyReplacement({
...input,
title,
}),
})
}
export const forkProject = async ({ input }) => {
requireAuth()
const projectData = await db.project.findUnique({
where: {
id: input.forkedFromId,
},
})
const isUniqueCallback = isUniqueProjectTitle(input.userId)
let title = projectData.title
title = await generateUniqueString(title, isUniqueCallback)
const { code, description, cadPackage } = projectData
return db.project.create({
data: foreignKeyReplacement({
...input,
title,
code: input.code || code,
description,
cadPackage,
}),
})
}
interface UpdateProjectArgs extends Prisma.ProjectWhereUniqueInput {
input: Prisma.ProjectUpdateInput
}
export const updateProject = async ({ id, input }: UpdateProjectArgs) => {
const checkSocialCardValidity = async (
projectId: string,
input: UpdateProjectArgs['input'],
oldProject: ProjectType
) => {
const titleChange = input.title && input.title !== oldProject.title
const descriptionChange =
input.description && input.description !== oldProject.description
if (titleChange || descriptionChange) {
const socialCard = await db.socialCard.findUnique({
where: { projectId },
})
if (socialCard) {
return db.socialCard.update({
data: { outOfDate: true },
where: { id: socialCard.id },
})
}
}
}
requireAuth()
const originalProject = await requireProjectOwnership({ projectId: id })
if (input.title) {
input.title = enforceAlphaNumeric(input.title)
}
const socialCardPromise = checkSocialCardValidity(id, input, originalProject)
const imageToDestroy =
originalProject.mainImage !== input.mainImage &&
input.mainImage &&
originalProject.mainImage
const update = await db.project.update({
data: foreignKeyReplacement(input),
where: { id },
})
if (imageToDestroy) {
console.log(
`image destroyed, publicId: ${imageToDestroy}, projectId: ${id}, replacing image is ${input.mainImage}`
)
// destroy after the db has been updated
await destroyImage({ publicId: imageToDestroy })
}
await socialCardPromise
return update
}
export const updateProjectImages = async ({
id,
mainImage64,
socialCard64,
}: {
id: string
mainImage64?: string
socialCard64?: string
}): Promise<ProjectType> => {
requireAuth()
const project = await requireProjectOwnership({ projectId: id })
const replaceSocialCard = async () => {
if (!socialCard64) {
return
}
let publicId = ''
let socialCardId = ''
try {
;({ id: socialCardId, url: publicId } = await db.socialCard.findUnique({
where: { projectId: id },
}))
} catch (e) {
const { userName } = await db.user.findUnique({
where: { id: project.userId },
})
publicId = makeSocialPublicIdServer(userName, project.title)
}
const imagePromise = uploadImage({
image64: socialCard64,
uploadPreset: 'CadHub_project_images',
publicId,
invalidate: true,
})
const saveOrUpdateSocialCard = () => {
const data = {
outOfDate: false,
url: publicId,
}
if (socialCardId) {
return db.socialCard.update({
data,
where: { projectId: id },
})
}
return db.socialCard.create({
data: {
...data,
project: {
connect: {
id: id,
},
},
},
})
}
const socialCardUpdatePromise = saveOrUpdateSocialCard()
const [socialCard] = await Promise.all([
socialCardUpdatePromise,
imagePromise,
])
return socialCard
}
const updateMainImage = async (): Promise<ProjectType> => {
if (!mainImage64) {
return project
}
const { public_id: mainImage } = await uploadImage({
image64: mainImage64,
uploadPreset: 'CadHub_project_images',
invalidate: true,
})
const projectPromise = db.project.update({
data: {
mainImage,
},
where: { id },
})
let imageDestroyPromise = new Promise((r) => r(null))
if (project.mainImage) {
console.log(
`image destroyed, publicId: ${project.mainImage}, projectId: ${id}, replacing image is ${mainImage}`
)
// destroy after the db has been updated
imageDestroyPromise = destroyImage({ publicId: project.mainImage })
}
const [updatedProject] = await Promise.all([
projectPromise,
imageDestroyPromise,
])
return updatedProject
}
const [updatedProject] = await Promise.all([
updateMainImage(),
replaceSocialCard(),
])
return updatedProject
}
export const deleteProject = async ({ id }: Prisma.ProjectWhereUniqueInput) => {
requireAuth()
await requireOwnership({ projectId: id })
const project = await db.project.findUnique({
where: { id },
})
const childrenDeletePromises = [
db.comment.deleteMany({ where: { projectId: project.id } }),
db.projectReaction.deleteMany({ where: { projectId: project.id } }),
db.socialCard.deleteMany({ where: { projectId: project.id } }),
]
await Promise.all(childrenDeletePromises)
await db.project.delete({
where: { id },
})
return project
}
export const Project = {
forkedFrom: (_obj, { root }) =>
root.forkedFromId &&
db.project.findUnique({ where: { id: root.forkedFromId } }),
childForks: (_obj, { root }) =>
db.project.findMany({ where: { forkedFromId: root.id } }),
user: (_obj, { root }: ResolverArgs<ReturnType<typeof project>>) =>
db.user.findUnique({ where: { id: root.userId } }),
socialCard: (_obj, { root }: ResolverArgs<ReturnType<typeof project>>) =>
db.project.findUnique({ where: { id: root.id } }).socialCard(),
Comment: (_obj, { root }: ResolverArgs<ReturnType<typeof project>>) =>
db.project
.findUnique({ where: { id: root.id } })
.Comment({ orderBy: { createdAt: 'desc' } }),
Reaction: (_obj, { root }: ResolverArgs<ReturnType<typeof project>>) =>
db.project
.findUnique({ where: { id: root.id } })
.Reaction({ where: { userId: _obj.userId } }),
}

View File

@@ -1,25 +0,0 @@
import { ResolverArgs, BeforeResolverSpecType } from '@redwoodjs/graphql-server'
import type { Prisma } from '@prisma/client'
import { db } from 'src/lib/db'
import { requireAuth } from 'src/lib/auth'
// Used when the environment variable REDWOOD_SECURE_SERVICES=1
export const beforeResolver = (rules: BeforeResolverSpecType) => {
rules.add(requireAuth)
}
export const socialCards = () => {
return db.socialCard.findMany()
}
export const socialCard = ({ id }: Prisma.SocialCardWhereUniqueInput) => {
return db.socialCard.findUnique({
where: { id },
})
}
export const SocialCard = {
project: (_obj, { root }: ResolverArgs<ReturnType<typeof socialCard>>) =>
db.socialCard.findUnique({ where: { id: root.id } }).project(),
}

View File

@@ -0,0 +1,9 @@
/*
import { subjectAccessRequests } from './subjectAccessRequests'
*/
describe('subjectAccessRequests', () => {
it('returns true', () => {
expect(true).toBe(true)
})
})

View File

@@ -1,30 +1,8 @@
import { UserInputError, ForbiddenError } from '@redwoodjs/graphql-server'
import { db } from 'src/lib/db'
import { requireAuth } from 'src/lib/auth'
import { requireOwnership } from 'src/lib/owner'
import { UserInputError } from '@redwoodjs/api'
import { enforceAlphaNumeric, destroyImage } from 'src/services/helpers'
import type { Prisma } from '@prisma/client'
function userNameVerification(userName: string): string {
if (userName.length < 5) {
throw new ForbiddenError('userName too short')
}
if (userName && ['new', 'edit', 'update'].includes(userName)) {
//TODO complete this and use a regexp so that it's not case sensitive, don't want someone with the userName eDiT
throw new UserInputError(
`You've tried to use a protected word as your userName, try something other than that`
)
}
if (userName) {
return enforceAlphaNumeric(userName)
}
}
function nameVerification(name: string) {
if (typeof name === 'string' && name.length < 3) {
throw new ForbiddenError('name too short')
}
}
export const users = () => {
requireAuth({ role: 'admin' })
@@ -47,54 +25,35 @@ export const createUser = ({ input }) => {
requireAuth({ role: 'admin' })
createUserInsecure({ input })
}
export const createUserInsecure = ({
input,
}: {
input: Prisma.UserUncheckedCreateInput
}) => {
if (typeof input.userName === 'string') {
input.userName = userNameVerification(input.userName)
}
nameVerification(input.name)
export const createUserInsecure = ({ input }) => {
return db.user.create({
data: input,
})
}
export const updateUser = ({
id,
input,
}: {
id: string
input: Prisma.UserUncheckedCreateInput
}) => {
export const updateUser = ({ id, input }) => {
requireAuth()
if (typeof input.userName === 'string') {
input.userName = userNameVerification(input.userName)
}
nameVerification(input.name)
return db.user.update({
data: input,
where: { id },
})
}
export const updateUserByUserName = async ({
userName,
input,
}: {
userName: string
input: Prisma.UserUncheckedCreateInput
}) => {
export const updateUserByUserName = async ({ userName, input }) => {
requireAuth()
await requireOwnership({ userName })
if (typeof input.userName === 'string') {
input.userName = userNameVerification(input.userName)
if (input.userName) {
input.userName = enforceAlphaNumeric(input.userName)
}
nameVerification(input.name)
const originalProject = await db.user.findUnique({ where: { userName } })
if (input.userName && ['new', 'edit', 'update'].includes(input.userName)) {
//TODO complete this and use a regexp so that it's not case sensitive, don't want someone with the userName eDiT
throw new UserInputError(
`You've tried to use a protected word as your userName, try something other than that`
)
}
const originalPart = await db.user.findUnique({ where: { userName } })
const imageToDestroy =
originalProject.image !== input.image && originalProject.image
originalPart.image !== input.image && originalPart.image
const update = await db.user.update({
data: input,
where: { userName },
@@ -114,14 +73,14 @@ export const deleteUser = ({ id }) => {
}
export const User = {
Projects: (_obj, { root }) =>
db.user.findUnique({ where: { id: root.id } }).Project(),
Project: (_obj, { root }) =>
_obj.projectTitle &&
db.project.findUnique({
Parts: (_obj, { root }) =>
db.user.findUnique({ where: { id: root.id } }).Part(),
Part: (_obj, { root }) =>
_obj.partTitle &&
db.part.findUnique({
where: {
title_userId: {
title: _obj.projectTitle,
title: _obj.partTitle,
userId: root.id,
},
},

View File

@@ -0,0 +1,9 @@
/*
import { users } from './users'
*/
describe('users', () => {
it('returns true', () => {
expect(true).toBe(true)
})
})

View File

@@ -1,17 +0,0 @@
{
"compilerOptions": {
"noEmit": true,
"allowJs": true,
"esModuleInterop": true,
"target": "esnext",
"module": "esnext",
"moduleResolution": "node",
"baseUrl": "./",
"paths": {
"src/*": ["./src/*"]
},
"typeRoots": ["../node_modules/@types", "./node_modules/@types"],
"types": ["jest"]
},
"include": ["src", "../.redwood/**/*"]
}

View File

@@ -1,5 +1,7 @@
const { getPaths } = require('@redwoodjs/internal')
const { getConfig } = require('@redwoodjs/internal')
const config = getConfig()
module.exports = {
schema: getPaths().generated.schema,
schema: `http://${config.api.host}:${config.api.port}/graphql`,
}

View File

@@ -4,16 +4,7 @@ publish = "web/dist"
functions = "api/dist/functions"
[dev]
# To use [Netlify Dev](https://www.netlify.com/products/dev/),
# install netlify-cli from https://docs.netlify.com/cli/get-started/#installation
# and then use netlify link https://docs.netlify.com/cli/get-started/#link-and-unlink-sites
# to connect your local project to a site already on Netlify
# then run netlify dev and our app will be accessible on the port specified below
framework = "redwoodjs"
# Set targetPort to the [web] side port as defined in redwood.toml
targetPort = 8910
# Point your browser to this port to access your RedwoodJS app
port = 8888
command = "yarn rw dev"
[[redirects]]
from = "/*"
@@ -22,10 +13,3 @@ functions = "api/dist/functions"
[context.deploy-preview.environment]
CAD_LAMBDA_BASE_URL = "https://t7wdlz8ztf.execute-api.us-east-1.amazonaws.com/dev2"
[[plugins]]
package = "@sentry/netlify-build-plugin"
[plugins.inputs]
sentryOrg = "kurt"
sentryProject = "kurt"

View File

@@ -6,34 +6,19 @@
"web"
]
},
"scripts": {
"cad": "yarn rw build api && zip-it-and-ship-it api/dist/functions/ api/dist/zipball && docker-compose --file ./api/src/docker/docker-compose.yml up --build",
"cad-r": "yarn rw build api && zip-it-and-ship-it api/dist/functions/ api/dist/zipball && docker-compose --file ./api/src/docker/docker-compose.yml restart",
"aws-emulate": "nodemon ./api/src/docker/aws-emulator.js"
},
"scripts": {},
"devDependencies": {
"@redwoodjs/core": "^0.38.1"
"@redwoodjs/core": "^0.31.0"
},
"eslintConfig": {
"extends": "@redwoodjs/eslint-config",
"rules": {
"react/no-unescaped-entities": [
"error",
{
"forbid": [
">",
"}",
"\""
]
}
]
}
"extends": "@redwoodjs/eslint-config"
},
"engines": {
"node": ">=14.x <=16.x",
"node": ">=14",
"yarn": ">=1.15"
},
"prisma": {
"seed": "yarn rw exec seed"
"resolutions": {
"react": "17.0.1",
"react-dom": "17.0.1"
}
}

View File

@@ -7,27 +7,11 @@
[web]
port = 8910
title = 'CadHub'
# apiUrl = "/.netlify/functions"
apiUrl = "https://uk5gegwopd.execute-api.us-east-2.amazonaws.com/.netlify/functions"
includeEnvironmentVariables = [
'GOOGLE_ANALYTICS_ID',
'CLOUDINARY_API_KEY',
# 'CLOUDINARY_API_SECRET',
'CAD_LAMBDA_BASE_URL',
'SENTRY_DSN',
'SENTRY_AUTH_TOKEN',
'SENTRY_ORG',
'SENTRY_PROJECT',
# 'EMAIL_PASSWORD'
]
apiProxyPath = "/.netlify/functions"
includeEnvironmentVariables = ['GOOGLE_ANALYTICS_ID', 'CLOUDINARY_API_KEY', 'CLOUDINARY_API_SECRET', 'CAD_LAMBDA_BASE_URL']
# experimentalFastRefresh = true # this seems to break cascadeStudio
[api]
port = 8911
schemaPath = "./api/db/schema.prisma"
[browser]
open = true
[experimental]
esbuild = true

View File

@@ -1,235 +0,0 @@
import type { Prisma } from '@prisma/client'
import { db } from '$api/src/lib/db'
export default async () => {
try {
const users = [
{
id: "a2b21ce1-ae57-43a2-b6a3-b6e542fd9e60",
userName: "local-user-1",
name: "local 1",
email: "localUser1@kurthutten.com"
},
{
id: "682ba807-d10e-4caf-bf28-74054e46c9ec",
userName: "local-user-2",
name: "local 2",
email: "localUser2@kurthutten.com"
},
{
id: "5cea3906-1e8e-4673-8f0d-89e6a963c096",
userName: "local-admin-2",
name: "local admin",
email: "localAdmin@kurthutten.com"
},
]
let existing
existing = await db.user.findMany({ where: { id: users[0].id }})
if(!existing.length) {
await db.user.create({
data: users[0],
})
}
existing = await db.user.findMany({ where: { id: users[1].id }})
if(!existing.length) {
await db.user.create({
data: users[1],
})
}
const projects = [
{
title: 'demo-project1',
description: '# can be markdown',
mainImage: 'CadHub/kjdlgjnu0xmwksia7xox',
code: getOpenScadHingeCode(),
cadPackage: 'openscad',
user: {
connect: {
id: users[0].id,
},
},
},
{
title: 'demo-project2',
description: '## [hey](www.google.com)',
user: {
connect: {
id: users[1].id,
},
},
},
]
existing = await db.project.findMany({where: { title: projects[0].title}})
if(!existing.length) {
await db.project.create({
data: projects[0],
})
}
existing = await db.project.findMany({where: { title: projects[1].title}})
if(!existing.length) {
const result = await db.project.create({
data: projects[1],
})
await db.project.create({
data: {
...projects[1],
title: `${projects[1].title}-fork`,
forkedFrom: {
connect: {
id: result.id,
},
},
},
})
}
const aProject = await db.project.findUnique({where: {
title_userId: {
title: projects[0].title,
userId: users[0].id,
}
}})
await db.comment.create({
data: {
text: "nice project, I like it",
userId: users[0].id,
projectId: aProject.id,
// user: {connect: { id: users[0].id}},
// project: {connect: { id: aProject.id}},
}
})
await db.projectReaction.create({
data: {
emote: "❤️",
userId: users[0].id,
projectId: aProject.id,
// user: {connect: { id: users[0].id}},
// project: {connect: { id: aProject.id}},
}
})
} catch (error) {
console.warn('Please define your seed data.')
console.error(error)
}
}
function getOpenScadHingeCode () {
return `
baseWidth=15; // [0.1:0.1:50]
hingeLength=30; // [0.1:0.1:50]
// How many mounting holes per half.
mountingHoleCount=3; // [1:20]
baseThickness=3; // [0.1:0.1:20]
pivotRadius=5; // [0.1:0.1:20]
// Pin that the hinge pivots on.
pinRadius=2; // [0.1:0.1:20]
mountingHoleRadius=1.5; // [0.1:0.1:10]
// How far away the hole is from the egde.
mountingHoleEdgeOffset=4; // [0:50]
// Depending on the accuracy of your printer this may need to be increased in order for print in place to work.
clearance=0.2; // [0.05:0.01:1]
// Radius difference in the pivot taper to stop the hinge from falling apart. Should be increased with large clearance values.
pinTaper=0.25; // [0.1:0.1:2]
// calculated values
hingeHalfExtrudeLength=hingeLength/2-clearance/2;
mountingHoleMoveIncrement=(hingeLength-2*mountingHoleEdgeOffset)/
(mountingHoleCount-1);
module costomizerEnd() {}
$fn=30;
tiny=0.005;
// modules
module hingeBaseProfile() {
translate([pivotRadius,0,0]){
square([baseWidth,baseThickness]);
}
}
module hingeBodyHalf() {
difference() {
union() {
linear_extrude(hingeHalfExtrudeLength){
offset(1)offset(-2)offset(1){
translate([0,pivotRadius,0]){
circle(pivotRadius);
}
square([pivotRadius,pivotRadius]);
hingeBaseProfile();
}
}
linear_extrude(hingeLength){
offset(1)offset(-1)hingeBaseProfile();
}
}
plateHoles();
}
}
module pin(rotateY, radiusOffset) {
translate([0,pivotRadius,hingeHalfExtrudeLength+tiny]){
rotate([0,rotateY,0]) {
cylinder(
h=hingeLength/2+clearance/2,
r1=pinRadius+radiusOffset,
r2=pinRadius+pinTaper+radiusOffset
);
}
}
}
module hingeHalfFemale() {
difference() {
hingeBodyHalf();
pin(rotateY=180, radiusOffset=clearance);
}
}
module hingeHalfMale() {
translate([0,0,hingeLength]) {
rotate([0,180,0]) {
hingeBodyHalf();
pin(rotateY=0, radiusOffset=0);
}
}
}
module plateHoles() {
for(i=[0:mountingHoleCount-1]){
translate([
baseWidth/2+pivotRadius,
-baseThickness,
i*mountingHoleMoveIncrement+mountingHoleEdgeOffset
]){
rotate([-90,0,0]){
cylinder(r=mountingHoleRadius,h=baseThickness*4);
}
}
}
}
// using high-level modules
translate([0,0,-15]) {
hingeHalfFemale();
hingeHalfMale();
}
`
}

View File

@@ -1,188 +0,0 @@
# See the full yml reference at https://www.serverless.com/framework/docs/providers/aws/guide/serverless.yml/
service: cadhubapi
# Uncomment org and app if you want to integrate your deployment with the Serverless dashboard. See https://www.serverless.com/framework/docs/dashboard/ for more details.
# org: your-org
# app: your-app
plugins:
- serverless-dotenv-plugin
- serverless-binary-cors
- serverless-plugin-git-variables
custom:
dotenv:
include:
- DATABASE_URL_PROD
- CLOUDINARY_API_KEY
- CLOUDINARY_API_SECRET
- EMAIL_PASSWORD
- SENTRY_DSN
# - # List the environment variables you want to include from your .env file here.
provider:
name: aws
lambdaHashingVersion: 20201221
runtime: nodejs14.x
region: us-east-2 # This is the AWS region where the service will be deployed.
httpApi: # HTTP API is used by default. To learn about the available options in API Gateway, see https://docs.aws.amazon.com/apigateway/latest/developerguide/http-api-vs-rest.html
cors: true
payload: '1.0'
stackTags: # Add CloudFormation stack tags here
source: serverless
name: Redwood Lambda API with HTTP API Gateway
tags: # Add service wide tags here
name: Redwood Lambda API with HTTP API Gateway
ecr:
images:
# this image is built locally and pushed to ECR
openscadimage:
path: ./
file: api/src/docker/openscad/Dockerfile
cadqueryimage:
path: ./
file: api/src/docker/cadquery/Dockerfile
apiGateway:
metrics: true
binaryMediaTypes:
# we need to allow binary types to be able to send back images and stls, but it would be better to be more specific,
# ie image/png etc., as */* treats everything as binary, including the json body passed as input to the lambdas,
# which means we need to decode the input body from base64, but the images break with anything other than */* :(
- '*/*'
package:
individually: true
functions:
check-user-name:
description: check-user-name function deployed on AWS Lambda
package:
artifact: api/dist/zipball/check-user-name.zip # This is the default location of the zip file generated during the deploy command.
memorySize: 1024 # mb
timeout: 25 # seconds (max: 29)
tags: # Tags for this specific lambda function
endpoint: /.netlify/functions/check-user-name
# Uncomment this section to add environment variables either from the Serverless dotenv plugin or using Serverless params
environment:
SENTRY_DSN: ${env:SENTRY_DSN}
DATABASE_URL: ${env:DATABASE_URL_PROD}
COMMIT_REF: ${git:sha1}
CONTEXT: TODO
handler: check-user-name.handler
events:
- httpApi:
path: /.netlify/functions/check-user-name
method: GET
# cors: true
- httpApi:
path: /.netlify/functions/check-user-name
method: POST
# cors: true
graphql:
description: graphql function deployed on AWS Lambda
package:
artifact: api/dist/zipball/graphql.zip # This is the default location of the zip file generated during the deploy command.
memorySize: 1024 # mb
timeout: 25 # seconds (max: 29)
tags: # Tags for this specific lambda function
endpoint: /.netlify/functions/graphql
# Uncomment this section to add environment variables either from the Serverless dotenv plugin or using Serverless params
environment:
CLOUDINARY_API_KEY: ${env:CLOUDINARY_API_KEY}
CLOUDINARY_API_SECRET: ${env:CLOUDINARY_API_SECRET}
EMAIL_PASSWORD: ${env:EMAIL_PASSWORD}
SENTRY_DSN: ${env:SENTRY_DSN}
DATABASE_URL: ${env:DATABASE_URL_PROD}
COMMIT_REF: ${git:sha1}
CONTEXT: TODO
# YOUR_FIRST_ENV_VARIABLE: ${env:YOUR_FIRST_ENV_VARIABLE}
handler: graphql.handler
events:
- httpApi:
path: /.netlify/functions/graphql
method: GET
# cors: true
- httpApi:
path: /.netlify/functions/graphql
method: POST
# cors: true
# identity-signup: # this is netlify specific and is related to go true auth, so we'll continue having that deployed on netlify
# description: identity-signup function deployed on AWS Lambda
# package:
# artifact: api/dist/zipball/identity-signup.zip # This is the default location of the zip file generated during the deploy command.
# memorySize: 1024 # mb
# timeout: 25 # seconds (max: 29)
# tags: # Tags for this specific lambda function
# endpoint: /.netlify/functions/identity-signup
# # Uncomment this section to add environment variables either from the Serverless dotenv plugin or using Serverless params
# # environment:
# # YOUR_FIRST_ENV_VARIABLE: ${env:YOUR_FIRST_ENV_VARIABLE}
# handler: identity-signup.handler
# events:
# - httpApi:
# path: /.netlify/functions/identity-signup
# method: GET
# - httpApi:
# path: /.netlify/functions/identity-signup
# method: POST
openscadpreview:
image:
name: openscadimage
command:
- openscad.preview
entryPoint:
- '/entrypoint.sh'
events:
- http:
path: openscad/preview
method: post
cors: true
timeout: 25
openscadstl:
image:
name: openscadimage
command:
- openscad.stl
entryPoint:
- '/entrypoint.sh'
events:
- http:
path: openscad/stl
method: post
cors: true
timeout: 30
cadquerystl:
image:
name: cadqueryimage
command:
- cadquery.stl
entryPoint:
- '/entrypoint.sh'
events:
- http:
path: cadquery/stl
method: post
cors: true
timeout: 30
# this allows browsers to see error responses.
resources:
Resources:
GatewayResponseDefault4XX:
Type: 'AWS::ApiGateway::GatewayResponse'
Properties:
ResponseParameters:
gatewayresponse.header.Access-Control-Allow-Origin: "'*'"
gatewayresponse.header.Access-Control-Allow-Headers: "'*'"
ResponseType: DEFAULT_4XX
RestApiId:
Ref: 'ApiGatewayRestApi'
GatewayResponseDefault5XX:
Type: 'AWS::ApiGateway::GatewayResponse'
Properties:
ResponseParameters:
gatewayresponse.header.Access-Control-Allow-Origin: "'*'"
gatewayresponse.header.Access-Control-Allow-Headers: "'*'"
ResponseType: DEFAULT_5XX
RestApiId:
Ref: 'ApiGatewayRestApi'

View File

@@ -2,8 +2,7 @@ const path = require('path')
module.exports = {
plugins: [
require('postcss-import'),
require('tailwindcss')(path.resolve(__dirname, 'tailwind.config.js')),
require('tailwindcss')(path.resolve(__dirname, '../tailwind.config.js')),
require('autoprefixer'),
],
}

View File

@@ -1,13 +1,92 @@
const MonacoWebpackPlugin = require('monaco-editor-webpack-plugin')
module.exports = (config, { env }) => {
config.plugins.forEach((plugin) => {
if (plugin.constructor.name === 'HtmlWebpackPlugin') {
plugin.userOptions.favicon = './src/favicon.svg'
plugin.options.favicon = './src/favicon.svg'
} else if (plugin.constructor.name === 'CopyPlugin') {
plugin.patterns.push({
from: './src/cascade/js/StandardLibraryIntellisense.ts',
to: 'js/StandardLibraryIntellisense.ts',
})
plugin.patterns.push({
from: './src/cascade/static_node_modules/opencascade.js/dist/oc.d.ts',
to: 'opencascade.d.ts',
})
plugin.patterns.push({
from: '../node_modules/three/src/Three.d.ts',
to: 'Three.d.ts',
})
plugin.patterns.push({
from: './src/cascade/fonts',
to: 'fonts',
})
plugin.patterns.push({
from: './src/cascade/textures',
to: 'textures',
})
}
})
config.module.rules.push({
test: /\.(md|jscad\.js|py|scad)$/i,
use: 'raw-loader',
});
config.plugins.push(
new MonacoWebpackPlugin({
languages: ['typescript'],
features: [
'accessibilityHelp',
'anchorSelect',
'bracketMatching',
'caretOperations',
'clipboard',
'codeAction',
'codelens',
'comment',
'contextmenu',
'coreCommands',
'cursorUndo',
'documentSymbols',
'find',
'folding',
'fontZoom',
'format',
'gotoError',
'gotoLine',
'gotoSymbol',
'hover',
'inPlaceReplace',
'indentation',
'inlineHints',
'inspectTokens',
'linesOperations',
'linkedEditing',
'links',
'multicursor',
'parameterHints',
'quickCommand',
'quickHelp',
'quickOutline',
'referenceSearch',
'rename',
'smartSelect',
'snippets',
'suggest',
'toggleHighContrast',
'toggleTabFocusMode',
'transpose',
'unusualLineTerminators',
'viewportSemanticTokens',
'wordHighlighter',
'wordOperations',
'wordPartOperations',
],
})
)
config.module.rules[0].oneOf.push({
test: /opencascade\.wasm\.wasm$/,
type: 'javascript/auto',
loader: 'file-loader',
})
config.node = {
fs: 'empty',
}
return config
}
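For context on the raw-loader rule added above: any import matching that test resolves to the file's contents as a string, which is how example sources can be bundled straight into the editor. The file path below is hypothetical:

```ts
// Hypothetical usage enabled by the raw-loader rule: the import is the raw
// text of the .scad file, not an executable module.
import hingeSource from 'src/examples/hinge.scad'

console.log(hingeSource.slice(0, 40))
```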

View File

@@ -1,10 +0,0 @@
declare module "worker-loader!*" {
// You need to change `Worker` if you specified a different value for the `workerType` option
class WebpackWorker extends Worker {
constructor();
}
// Uncomment this if you set the `esModule` option to `false`
// export = WebpackWorker;
export default WebpackWorker;
}

View File

@@ -1 +1,6 @@
module.exports = require('@redwoodjs/testing/config/jest/api')
const { getConfig } = require('@redwoodjs/core')
const config = getConfig({ type: 'jest', target: 'browser' })
config.displayName.name = 'web'
module.exports = config

View File

@@ -13,53 +13,43 @@
]
},
"dependencies": {
"@headlessui/react": "^1.4.1",
"@heroicons/react": "^1.0.4",
"@headlessui/react": "^1.0.0",
"@material-ui/core": "^4.11.0",
"@monaco-editor/react": "^4.0.11",
"@react-three/drei": "^7.3.1",
"@react-three/fiber": "^7.0.5",
"@react-three/postprocessing": "^2.0.5",
"@redwoodjs/auth": "^0.38.1",
"@redwoodjs/forms": "^0.38.1",
"@redwoodjs/router": "^0.38.1",
"@redwoodjs/web": "^0.38.1",
"@sentry/browser": "^6.5.1",
"@tailwindcss/aspect-ratio": "0.2.1",
"axios": "^0.21.1",
"browser-fs-access": "^0.17.2",
"@redwoodjs/auth": "^0.31.0",
"@redwoodjs/forms": "^0.31.0",
"@redwoodjs/router": "^0.31.0",
"@redwoodjs/web": "^0.31.0",
"cloudinary-react": "^1.6.7",
"controlkit": "^0.1.9",
"get-active-classes": "^0.0.11",
"golden-layout": "^1.5.9",
"gotrue-js": "^0.9.27",
"hotkeys-js": "^3.8.7",
"html-to-image": "^1.7.0",
"lodash": "^4.17.21",
"jquery": "^3.5.1",
"monaco-editor": "^0.20.0",
"monaco-editor-webpack-plugin": "^1.9.1",
"netlify-identity-widget": "^1.9.1",
"pako": "^2.0.3",
"opencascade.js": "^0.1.15",
"prop-types": "^15.7.2",
"react": "^17.0.2",
"react-dom": "^17.0.2",
"react": "^17.0.1",
"react-dom": "^17.0.1",
"react-dropzone": "^11.2.1",
"react-ga": "^3.3.0",
"react-helmet": "^6.1.0",
"react-hotkeys-hook": "^3.4.0",
"react-image-crop": "^8.6.6",
"react-intersection-observer": "^8.32.1",
"react-mosaic-component": "^5.0.0",
"react-tabs": "^3.2.2",
"react-mosaic-component": "^4.1.1",
"react-three-fiber": "^5.3.19",
"rich-markdown-editor": "^11.0.2",
"styled-components": "^5.2.0",
"three": "^0.130.1",
"worker-loader": "^3.0.8"
"three": "^0.118.3"
},
"devDependencies": {
"@types/lodash": "^4.14.170",
"autoprefixer": "^10.3.1",
"postcss": "^8.2.13",
"autoprefixer": "^10.2.5",
"html-webpack-plugin": "^4.5.0",
"postcss": "^8.3.6",
"postcss-import": "^14.0.2",
"postcss-loader": "^6.1.1",
"raw-loader": "^4.0.2",
"tailwindcss": "^2.2.7"
"opentype.js": "^1.3.3",
"postcss-loader": "4.0.2",
"tailwindcss": "^2.1.2",
"worker-loader": "^3.0.7"
}
}
}

Binary file not shown.

Image not shown (before: 102 KiB).

Binary file not shown.

Some files were not shown because too many files have changed in this diff.