Compare commits

...

42 Commits

Author SHA1 Message Date
Yeicor
6f95a2f3ad Automatically update version to 0.6.16 2024-03-16 09:55:30 +00:00
Yeicor
63c74461b2 Merge remote-tracking branch 'origin/master' 2024-03-16 10:54:38 +01:00
Yeicor
e85dc36fea clean frontend disconnection protocol 2024-03-16 10:54:26 +01:00
dependabot[bot]
43d30b0fdd Bump vite from 5.1.5 to 5.1.6 (#19)
Bumps [vite](https://github.com/vitejs/vite/tree/HEAD/packages/vite) from 5.1.5 to 5.1.6.
- [Release notes](https://github.com/vitejs/vite/releases)
- [Changelog](https://github.com/vitejs/vite/blob/main/packages/vite/CHANGELOG.md)
- [Commits](https://github.com/vitejs/vite/commits/v5.1.6/packages/vite)

---
updated-dependencies:
- dependency-name: vite
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-16 09:18:16 +00:00
dependabot[bot]
acba91322c Bump terser from 5.29.1 to 5.29.2 (#18)
Bumps [terser](https://github.com/terser/terser) from 5.29.1 to 5.29.2.
- [Changelog](https://github.com/terser/terser/blob/master/CHANGELOG.md)
- [Commits](https://github.com/terser/terser/compare/v5.29.1...v5.29.2)

---
updated-dependencies:
- dependency-name: terser
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-16 09:18:03 +00:00
dependabot[bot]
ba7ce3727d Bump @types/node from 20.11.25 to 20.11.28 (#16)
Bumps [@types/node](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/node) from 20.11.25 to 20.11.28.
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/node)

---
updated-dependencies:
- dependency-name: "@types/node"
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-16 09:17:45 +00:00
dependabot[bot]
d168806744 Bump vuetify from 3.5.8 to 3.5.9 (#15)
Bumps [vuetify](https://github.com/vuetifyjs/vuetify/tree/HEAD/packages/vuetify) from 3.5.8 to 3.5.9.
- [Release notes](https://github.com/vuetifyjs/vuetify/releases)
- [Commits](https://github.com/vuetifyjs/vuetify/commits/v3.5.9/packages/vuetify)

---
updated-dependencies:
- dependency-name: vuetify
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-16 09:17:27 +00:00
dependabot[bot]
919c05eb9d Bump @gltf-transform/functions from 3.10.0 to 3.10.1 (#14)
Bumps [@gltf-transform/functions](https://github.com/donmccurdy/glTF-Transform) from 3.10.0 to 3.10.1.
- [Changelog](https://github.com/donmccurdy/glTF-Transform/blob/main/CHANGELOG.md)
- [Commits](https://github.com/donmccurdy/glTF-Transform/compare/v3.10.0...v3.10.1)

---
updated-dependencies:
- dependency-name: "@gltf-transform/functions"
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-03-16 09:17:17 +00:00
Yeicor
2370fd72ed Automatically update version to 0.6.15 2024-03-14 16:27:55 +00:00
Yeicor
aef047a658 Merge remote-tracking branch 'origin/master' 2024-03-14 17:27:02 +01:00
Yeicor
d5cdd094e8 reduce idle cpu usage and add todo 2024-03-14 17:26:54 +01:00
Yeicor
9c71573934 Automatically update version to 0.6.14 2024-03-10 18:47:07 +00:00
Yeicor
8fc5ed7544 fix for export_all 2024-03-10 19:46:20 +01:00
Yeicor
1fd932dbc6 Merge remote-tracking branch 'origin/master' 2024-03-10 19:06:59 +01:00
Yeicor
539ac40e3d update readme links 2024-03-10 19:06:51 +01:00
Yeicor
9c2656d7db Automatically update version to 0.6.13 2024-03-10 17:57:32 +00:00
Yeicor
161d76ee69 Merge remote-tracking branch 'origin/master' 2024-03-10 18:56:52 +01:00
Yeicor
431c41a615 fix CI deployment 12 2024-03-10 18:56:45 +01:00
Yeicor
7144eb39da Automatically update version to 0.6.12 2024-03-10 17:41:13 +00:00
Yeicor
8e1c89ad6d fix CI deployment 11 2024-03-10 18:40:13 +01:00
Yeicor
7f692c0b52 Automatically update version to 0.6.11 2024-03-10 17:32:32 +00:00
Yeicor
86043132a8 Merge remote-tracking branch 'origin/master' 2024-03-10 18:32:00 +01:00
Yeicor
23b4d25464 fix CI deployment 10 2024-03-10 18:31:52 +01:00
Yeicor
22514d8603 Automatically update version to 0.6.10 2024-03-10 17:27:00 +00:00
Yeicor
b440a89b13 Merge remote-tracking branch 'origin/master' 2024-03-10 18:26:21 +01:00
Yeicor
cbdb5aff5e fix CI deployment 10 2024-03-10 18:26:13 +01:00
Yeicor
a3a9258a78 Automatically update version to 0.6.9 2024-03-10 17:21:51 +00:00
Yeicor
9f30ac8eb7 Merge remote-tracking branch 'origin/master' 2024-03-10 18:21:08 +01:00
Yeicor
e11c9dd5c6 fix CI deployment 9 2024-03-10 18:21:00 +01:00
Yeicor
520b89af4a Automatically update version to 0.6.8 2024-03-10 17:17:37 +00:00
Yeicor
ba9aef2454 Merge remote-tracking branch 'origin/master' 2024-03-10 18:16:43 +01:00
Yeicor
509b12cd97 fix CI deployment 8 2024-03-10 18:16:34 +01:00
Yeicor
40b4d51895 Automatically update version to 0.6.7 2024-03-10 17:13:50 +00:00
Yeicor
af68f8b1ff fix CI deployment 7 2024-03-10 18:13:05 +01:00
Yeicor
9cb6b29c93 fix CI deployment 6 2024-03-10 18:11:24 +01:00
Yeicor
3174a39ef9 Merge remote-tracking branch 'origin/master' 2024-03-10 18:09:37 +01:00
Yeicor
78231aff31 fix CI deployment 5 2024-03-10 18:09:26 +01:00
Yeicor
39f1231f90 Automatically update version to 0.6.5 2024-03-10 17:08:58 +00:00
Yeicor
ed9251faac fix CI deployment 4 2024-03-10 18:08:13 +01:00
Yeicor
49d0afa616 fix CI deployment 3 2024-03-10 18:05:21 +01:00
Yeicor
844860ee1a fix CI deployment 2 2024-03-10 17:58:06 +01:00
Yeicor
c1ae621e6f fix CI deployment 2024-03-10 17:56:48 +01:00
19 changed files with 434 additions and 238 deletions

View File

@@ -6,6 +6,11 @@ on:
     branches:
       - "master"
   workflow_call:
+    inputs:
+      ref:
+        type: "string"
+        required: true
+        description: "The ref (branch or tag) to build"
 jobs:
@@ -14,6 +19,8 @@ jobs:
     runs-on: "ubuntu-latest"
     steps:
       - uses: "actions/checkout@v4"
+        with:
+          ref: "${{ inputs.ref }}"
       - uses: "actions/setup-node@v4"
         with:
           cache: "yarn"
@@ -30,6 +37,8 @@ jobs:
     runs-on: "ubuntu-latest"
     steps:
       - uses: "actions/checkout@v4"
+        with:
+          ref: "${{ inputs.ref }}"
       - run: "pipx install poetry"
       - uses: "actions/setup-python@v5"
         with:
@@ -43,6 +52,8 @@ jobs:
     runs-on: "ubuntu-latest"
     steps:
       - uses: "actions/checkout@v4"
+        with:
+          ref: "${{ inputs.ref }}"
       - run: "pipx install poetry"
       - uses: "actions/setup-python@v5"
         with:
@@ -61,13 +72,15 @@ jobs:
     runs-on: "ubuntu-latest"
     steps:
       - uses: "actions/checkout@v4"
+        with:
+          ref: "${{ inputs.ref }}"
       - run: "pipx install poetry"
       - uses: "actions/setup-python@v5"
         with:
           python-version: "3.11"
           cache: "poetry"
       - run: "SKIP_BUILD_FRONTEND=true poetry install"
-      - run: "PYTHONPATH=yacv_server YACV_DISABLE_SERVER=true poetry run python example/object.py"
+      - run: "YACV_DISABLE_SERVER=true poetry run python example/object.py"
       - uses: "actions/upload-artifact@v4"
         with:
           name: "example"

.github/workflows/deploy1.yml (new file)
View File

@@ -0,0 +1,59 @@
+on:
+  push:
+    tags:
+      - "v**"
+
+permissions: # Same as deploy2.yml
+  contents: "write"
+  pages: "write"
+  id-token: "write"
+
+jobs:
+  update-versions:
+    runs-on: "ubuntu-latest"
+    steps:
+      - uses: "actions/checkout@v4"
+        with: # Ensure we are not in a detached HEAD state
+          ref: "master"
+      # Check that the tag commit is the latest master commit
+      - run: |
+          git fetch --tags
+          tag_commit=$(git rev-parse ${{ github.ref }})
+          master_commit=$(git rev-parse master)
+          if [ "$tag_commit" != "$master_commit" ]; then
+            echo "The tag commit ($tag_commit) is not the latest master commit ($master_commit)"
+            exit 1
+          fi
+      - run: "echo 'CLEAN_VERSION=${{ github.ref }}' | sed 's,refs/tags/v,,g' >> $GITHUB_ENV"
+      # Write the new version to package.json
+      - uses: "actions/setup-node@v4"
+      - run: "yarn version --new-version $CLEAN_VERSION --no-git-tag-version"
+      # Write the new version to pyproject.toml
+      - run: "pipx install poetry"
+      - uses: "actions/setup-python@v5"
+        with:
+          python-version: "3.11"
+          cache: "poetry"
+      - run: "poetry version $CLEAN_VERSION"
+      # Commit the changes and move the tag!
+      - run: |
+          git config --global user.email "yeicor@users.noreply.github.com"
+          git config --global user.name "Yeicor"
+          if git commit -am "Automatically update version to $CLEAN_VERSION"; then
+            git push
+            # Move the tag to the new commit
+            git tag -f -a "v$CLEAN_VERSION" -m "v$CLEAN_VERSION"
+            git push -f --tags # Force push the tag to GitHub
+            # The tag move will NOT trigger a new workflow
+          else
+            echo "No source change detected on version update (did you repeat a release tag??)"
+            exit 1
+          fi
+
+  deploy: # Makes sure all artifacts are updated and use the new version for the next deployment steps
+    needs: "update-versions"
+    uses: "./.github/workflows/deploy2.yml"
+    secrets: "inherit" # Inherit the secrets from the parent workflow
+    with:
+      ref: "master" # Ensure we are cloning the latest version of the repository

View File

@@ -1,7 +1,10 @@
 on:
-  push:
-    tags:
-      - "v**"
+  workflow_call:
+    inputs:
+      ref:
+        type: "string"
+        required: true
+        description: "The ref (branch or tag) to build"
 
 # Sets permissions of the GITHUB_TOKEN to allow deployment to GitHub Pages
 permissions:
@@ -17,25 +20,10 @@ concurrency:
 jobs:
-  update-versions:
-    runs-on: "ubuntu-latest"
-    steps:
-      - uses: "actions/checkout@v4"
-      - run: "echo 'CLEAN_VERSION=${{ github.ref }}' | sed 's,refs/tags/v,,g' >> $GITHUB_ENV"
-      # Write the new version to package.json
-      - uses: "actions/setup-node@v4"
-      - run: "yarn version --new-version $CLEAN_VERSION"
-      # Write the new version to pyproject.toml
-      - run: "pipx install poetry"
-      - uses: "actions/setup-python@v5"
-        with:
-          python-version: "3.11"
-          cache: "poetry"
-      - run: "poetry version $CLEAN_VERSION"
   rebuild: # Makes sure all artifacts are updated and use the new version
-    needs: "update-versions"
     uses: "./.github/workflows/build.yml"
+    with:
+      ref: "${{ inputs.ref }}"
 
   deploy-frontend:
     needs: "rebuild"
@@ -44,24 +32,14 @@
       name: "github-pages"
       url: "${{ steps.deployment.outputs.page_url }}"
     steps:
-      - uses: "dawidd6/action-download-artifact@v3"
-        with:
-          workflow: "build.yml"
-          name: "frontend"
-          path: "./public"
-          allow_forks: false
-      - uses: "dawidd6/action-download-artifact@v3"
-        with:
-          workflow: "build.yml"
-          name: "logo"
-          path: "./public"
-          allow_forks: false
-      - uses: "dawidd6/action-download-artifact@v3"
-        with:
-          workflow: "build.yml"
-          name: "example"
-          path: "./public"
-          allow_forks: false
+      - uses: "actions/download-artifact@v4"
+        with: # Downloads all artifacts from the build job
+          path: "./public"
+      - run: | # Merge the subdirectories of public into a single directory
+          for dir in public/*; do
+            mv "$dir/"* public/
+            rmdir "$dir"
+          done
       - uses: "actions/configure-pages@v4"
       - uses: "actions/upload-pages-artifact@v3"
         with:
@@ -81,7 +59,17 @@
     runs-on: "ubuntu-latest"
     steps:
       - uses: "actions/checkout@v4"
-      - uses: "JRubics/poetry-publish@v2"
+        with:
+          ref: "${{ inputs.ref }}"
+      - uses: "actions/setup-node@v4"
+        with:
+          cache: "yarn"
+      - run: "pipx install poetry"
+      - uses: "actions/setup-python@v5"
         with:
           python-version: "3.11"
-          pypi_token: "${{ secrets.PYPI_TOKEN }}"
+          cache: "poetry"
+      - run: "poetry install"
+      - run: "poetry config pypi-token.pypi ${{ secrets.PYPI_TOKEN }}"
+      - run: "poetry publish --build"

View File

@@ -10,19 +10,19 @@ in a web browser.
 - All [GLTF 2.0](https://www.khronos.org/gltf/) features (textures, PBR materials, animations...).
 - All [model-viewer](https://modelviewer.dev/) features (smooth controls, augmented reality...).
 - Load multiple models at once, load external models and even images as quads.
-- View and interact with topological entities: faces, edges, vertices and locations.
 - Control clipping planes and transparency of each model.
+- View and interact with topological entities: faces, edges, vertices and locations.
 - Select any entity and measure bounding box size and distances.
-- Fully-featured static deployment: just upload the viewer and models to your server.
 - Hot reloading while editing the CAD model (using the `yacv-server` package).
+- Fully-featured static deployment: just upload the viewer and models to your server.
 
 ## Usage
 
 The [example](example) is a fully working project that shows how to use the viewer.
 You can play with the latest
-demo [here](https://yeicor-3d.github.io/yet-another-cad-viewer/?preload=base.glb&preload=fox.glb&preload=img.jpg.glb&preload=location.glb)
+demo [here](https://yeicor-3d.github.io/yet-another-cad-viewer/?preload=logo.glb&preload=fox.glb&preload=img.jpg.glb&preload=location.glb)
 (or
-[without animation](https://yeicor-3d.github.io/yet-another-cad-viewer/?autoplay=false&preload=base.glb&preload=fox.glb&preload=img.jpg.glb&preload=location.glb)).
+[without animation](https://yeicor-3d.github.io/yet-another-cad-viewer/?autoplay=false&preload=logo.glb&preload=fox.glb&preload=img.jpg.glb&preload=location.glb)).
 
 ![Demo](assets/screenshot.png)

View File

@@ -35,16 +35,30 @@ provide('disableTap', {disableTap, setDisableTap});
 async function onModelUpdateRequest(event: NetworkUpdateEvent) {
   // Load/unload a new batch of models to optimize rendering time
   console.log("Received model update request", event.models);
+  let shutdownRequestIndex = event.models.findIndex((model) => model.isRemove == null);
+  let shutdownRequest = null;
+  if (shutdownRequestIndex !== -1) {
+    console.log("Will shut down the connection after this load, as requested by the server");
+    shutdownRequest = event.models.splice(shutdownRequestIndex, 1)[0];
+  }
   let doc = sceneDocument.value;
   for (let modelIndex in event.models) {
     let isLast = parseInt(modelIndex) === event.models.length - 1;
     let model = event.models[modelIndex];
-    if (!model.isRemove) {
-      doc = await SceneMgr.loadModel(sceneUrl, doc, model.name, model.url, isLast, isLast);
-    } else {
-      doc = await SceneMgr.removeModel(sceneUrl, doc, model.name, isLast);
+    try {
+      if (!model.isRemove) {
+        doc = await SceneMgr.loadModel(sceneUrl, doc, model.name, model.url, isLast, isLast);
+      } else {
+        doc = await SceneMgr.removeModel(sceneUrl, doc, model.name, isLast);
+      }
+    } catch (e) {
+      console.error("Error loading model", model, e);
     }
   }
+  if (shutdownRequest !== null) {
+    console.log("Shutting down the connection as requested by the server");
+    event.disconnectForALittleBit();
+  }
   sceneDocument.value = doc
   triggerRef(sceneDocument); // Why not triggered automatically?
 }

View File

@@ -12,9 +12,10 @@ export let extrasNameValueHelpers = "__helpers";
  *
  * Remember to call mergeFinalize after all models have been merged (slower required operations).
  */
-export async function mergePartial(url: string, name: string, document: Document): Promise<Document> {
+export async function mergePartial(url: string, name: string, document: Document, networkFinished: () => void = () => {}): Promise<Document> {
     // Load the new document
     let newDoc = await io.read(url);
+    networkFinished()
 
     // Remove any previous model with the same name
     await document.transform(dropByName(name));

View File

@@ -7,22 +7,24 @@ class NetworkUpdateEventModel {
     url: string;
     // TODO: Detect and manage instances of the same object (same hash, different name)
     hash: string | null;
-    isRemove: boolean;
+    isRemove: boolean | null; // This is null for a shutdown event
 
-    constructor(name: string, url: string, hash: string | null, isDelete: boolean) {
+    constructor(name: string, url: string, hash: string | null, isRemove: boolean | null) {
         this.name = name;
         this.url = url;
         this.hash = hash;
-        this.isRemove = isDelete;
+        this.isRemove = isRemove;
     }
 }
 
 export class NetworkUpdateEvent extends Event {
     models: NetworkUpdateEventModel[];
+    disconnectForALittleBit: () => void;
 
-    constructor(models: NetworkUpdateEventModel[]) {
+    constructor(models: NetworkUpdateEventModel[], disconnectForALittleBit: () => void) {
         super("update");
         this.models = models;
+        this.disconnectForALittleBit = disconnectForALittleBit;
     }
 }
@@ -57,37 +59,57 @@ export class NetworkManager extends EventTarget {
         }
     }
 
-    private async monitorDevServer(url: URL) {
+    private async monitorDevServer(url: URL, pendingTimeout: { id: number } = {id: -1}) {
         try {
             // WARNING: This will spam the console logs with failed requests when the server is down
-            let response = await fetch(url.toString());
+            const controller = new AbortController();
+            let response = await fetch(url.toString(), {signal: controller.signal});
             // console.log("Monitoring", url.toString(), response);
             if (response.status === 200) {
                 let lines = readLinesStreamings(response.body!.getReader());
                 for await (let line of lines) {
                     if (!line || !line.startsWith("data:")) continue;
-                    let data = JSON.parse(line.slice(5));
+                    let data: { name: string, hash: string, is_remove: boolean | null } = JSON.parse(line.slice(5));
                     // console.debug("WebSocket message", data);
                     let urlObj = new URL(url);
                     urlObj.searchParams.delete("api_updates");
                     urlObj.searchParams.set("api_object", data.name);
-                    this.foundModel(data.name, data.hash, urlObj.toString(), data.is_remove);
+                    this.foundModel(data.name, data.hash, urlObj.toString(), data.is_remove, async () => {
+                        console.log("Disconnecting for a little bit");
+                        controller.abort();
+                        clearTimeout(pendingTimeout.id!);
+                        pendingTimeout.id = -2;
+                        setTimeout(() => {
+                            console.log("Reconnecting after a little bit");
+                            this.monitorDevServer(url, pendingTimeout)
+                        }, settings.monitorEveryMs * 50);
+                    });
                 }
             }
         } catch (e) { // Ignore errors (retry very soon)
         }
-        setTimeout(() => this.monitorDevServer(url), settings.monitorEveryMs);
+        if (pendingTimeout.id >= -1) {
+            pendingTimeout.id = setTimeout(() => {
+                console.log("Reconnecting fast");
+                this.monitorDevServer(url, pendingTimeout)
+            }, settings.monitorEveryMs);
+        }
         return;
     }
 
-    private foundModel(name: string, hash: string | null, url: string, isRemove: boolean) {
+    private foundModel(name: string, hash: string | null, url: string, isRemove: boolean | null, disconnectForALittleBit: () => void = () => {
+    }) {
         let prevHash = this.knownObjectHashes[name];
         // console.debug("Found model", name, "with hash", hash, "and previous hash", prevHash);
         if (!hash || hash !== prevHash || isRemove) {
-            if (!isRemove) {
+            // Update known hashes
+            if (isRemove == false) {
                 this.knownObjectHashes[name] = hash;
-            } else {
+            } else if (isRemove == true) {
+                if (!(name in this.knownObjectHashes)) return; // Nothing to remove...
                 delete this.knownObjectHashes[name];
+                // Also update buffered updates if the model is removed
+                //this.bufferedUpdates = this.bufferedUpdates.filter(m => m.name !== name);
             }
             let newModel = new NetworkUpdateEventModel(name, url, hash, isRemove);
             this.bufferedUpdates.push(newModel);
@@ -95,7 +117,7 @@ export class NetworkManager extends EventTarget {
             // Optimization: try to batch updates automatically for faster rendering
             if (this.batchTimeout !== null) clearTimeout(this.batchTimeout);
             this.batchTimeout = setTimeout(() => {
-                this.dispatchEvent(new NetworkUpdateEvent(this.bufferedUpdates));
+                this.dispatchEvent(new NetworkUpdateEvent(this.bufferedUpdates, disconnectForALittleBit));
                 this.bufferedUpdates = [];
             }, batchTimeout);
         }

View File

@@ -9,11 +9,11 @@ import {Matrix4} from "three/src/math/Matrix4.js"
 /** This class helps manage SceneManagerData. All methods are static to support reactivity... */
 export class SceneMgr {
     /** Loads a GLB model from a URL and adds it to the viewer or replaces it if the names match */
-    static async loadModel(sceneUrl: Ref<string>, document: Document, name: string, url: string, updateHelpers: boolean = true, reloadScene: boolean = true): Promise<Document> {
+    static async loadModel(sceneUrl: Ref<string>, document: Document, name: string, url: string, updateHelpers: boolean = true, reloadScene: boolean = true, networkFinished: () => void = () => {}): Promise<Document> {
         let loadStart = performance.now();
 
         // Start merging into the current document, replacing or adding as needed
-        document = await mergePartial(url, name, document);
+        document = await mergePartial(url, name, document, networkFinished);
 
         console.log("Model", name, "loaded in", performance.now() - loadStart, "ms");

View File

@@ -3,7 +3,6 @@ import {settings} from "../misc/settings";
 import {inject, onMounted, type Ref, ref, watch} from "vue";
 import {VList, VListItem} from "vuetify/lib/components/index.mjs";
 import {$renderer, $scene} from "@google/model-viewer/lib/model-viewer-base";
-import Loading from "../misc/Loading.vue";
 import {ModelViewerElement} from '@google/model-viewer';
 import type {ModelScene} from "@google/model-viewer/lib/three-components/ModelScene";
 import {Hotspot} from "@google/model-viewer/lib/three-components/Hotspot";
@@ -154,7 +153,7 @@ watch(disableTap, (value) => {
       <v-list v-for="src in settings.preload" :key="src">
         <v-list-item>{{ src }}</v-list-item>
       </v-list>
-      <loading></loading>
+      <!-- Too much idle CPU usage: <loading></loading> -->
     </div>
   </model-viewer>

View File

@@ -1,6 +1,6 @@
 {
   "name": "yet-another-cad-viewer",
-  "version": "0.6.0",
+  "version": "0.6.16",
   "description": "",
   "license": "MIT",
   "private": true,
@@ -16,7 +16,7 @@
   },
   "dependencies": {
     "@gltf-transform/core": "^3.10.0",
-    "@gltf-transform/functions": "^3.10.0",
+    "@gltf-transform/functions": "^3.10.1",
     "@google/model-viewer": "^3.4.0",
     "@jamescoyle/vue-icon": "^0.1.2",
     "@mdi/js": "^7.4.47",
@@ -24,11 +24,11 @@
     "three": "^0.160.1",
     "three-orientation-gizmo": "https://github.com/jrj2211/three-orientation-gizmo",
     "vue": "^3.4.21",
-    "vuetify": "^3.5.8"
+    "vuetify": "^3.5.9"
   },
   "devDependencies": {
     "@tsconfig/node20": "^20.1.2",
-    "@types/node": "^20.11.25",
+    "@types/node": "^20.11.28",
     "@types/three": "^0.160.0",
     "@vitejs/plugin-vue": "^5.0.3",
     "@vitejs/plugin-vue-jsx": "^3.1.0",
@@ -37,9 +37,9 @@
     "commander": "^12.0.0",
     "generate-license-file": "^3.0.1",
     "npm-run-all2": "^6.1.1",
-    "terser": "^5.29.1",
+    "terser": "^5.29.2",
     "typescript": "~5.4.2",
-    "vite": "^5.1.5",
+    "vite": "^5.1.6",
     "vue-tsc": "^2.0.6"
   }
 }

poetry.lock (generated)
View File

@@ -318,17 +318,6 @@ qtconsole = ["qtconsole"]
 test = ["pickleshare", "pytest (<7.1)", "pytest-asyncio (<0.22)", "testpath"]
 test-extra = ["curio", "matplotlib (!=3.2.0)", "nbformat", "numpy (>=1.22)", "pandas", "pickleshare", "pytest (<7.1)", "pytest-asyncio (<0.22)", "testpath", "trio"]
 
-[[package]]
-name = "iterators"
-version = "0.2.0"
-description = "Iterator utility classes and functions"
-optional = false
-python-versions = ">=3.6"
-files = [
-    {file = "iterators-0.2.0-py3-none-any.whl", hash = "sha256:1d7ff03f576c9de0e01bac66209556c066d6b1fc45583a99cfc9f4645be7900e"},
-    {file = "iterators-0.2.0.tar.gz", hash = "sha256:e9927a1ea1ef081830fd1512f3916857c36bd4b37272819a6cd29d0f44431b97"},
-]
-
 [[package]]
 name = "jedi"
 version = "0.19.1"
@@ -962,4 +951,4 @@ files = [
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.9"
-content-hash = "d9746e99dd8861758730e68d12dc72d9ec5fb0101b3c070a7d7a373439c658a0"
+content-hash = "567ef9c980c250ace7e380098b810250a36b92dd2e824b5b4f4851898a675e09"

View File

@@ -1,6 +1,6 @@
 [tool.poetry]
 name = "yacv-server"
-version = "0.6.0"
+version = "0.6.16"
 description = "Yet Another CAD Viewer (server)"
 authors = ["Yeicor <4929005+Yeicor@users.noreply.github.com>"]
 license = "MIT"
@@ -19,7 +19,6 @@ build123d = "^0.4.0"
 # Misc
 pygltflib = "^1.16.2"
 pillow = "^10.2.0"
-iterators = "^0.2.0"
 
 [tool.poetry.build]
 generate-setup-file = false

View File

@@ -1,6 +1,6 @@
 import os
 
-from cad import image_to_gltf
+from yacv_server.cad import image_to_gltf
 from yacv_server.yacv import YACV
 
 yacv = YACV()

View File

@@ -26,6 +26,8 @@ class GLTFMgr:
             textures=[Texture(source=0, sampler=0)],
             images=[Image(bufferView=0, mimeType=image[1])],
         )
+        # TODO: Reduce the number of draw calls by merging all faces into a single primitive, and using
+        #  color attributes + extension? to differentiate them (same for edges and vertices)
         self.gltf.set_binary_blob(image[0])
 
     def add_face(self, vertices_raw: List[Tuple[float, float, float]], indices_raw: List[Tuple[int, int, int]],

View File

@@ -1,13 +1,10 @@
 import io
 import os
-import threading
 import urllib.parse
 from http import HTTPStatus
 from http.server import SimpleHTTPRequestHandler
 
-from iterators import TimeoutIterator
-from mylogger import logger
+from yacv_server.mylogger import logger
 
 # Find the frontend folder (optional, but recommended)
 FILE_DIR = os.path.dirname(__file__)
@@ -26,13 +23,9 @@ OBJECTS_API_PATH = '/api/object'  # /{name}
 class HTTPHandler(SimpleHTTPRequestHandler):
     yacv: 'yacv.YACV'
-    frontend_lock: threading.Lock  # To avoid exiting too early while frontend makes requests
-    at_least_one_client: threading.Event
 
     def __init__(self, *args, yacv: 'yacv.YACV', **kwargs):
         self.yacv = yacv
-        self.frontend_lock = threading.Lock()
-        self.at_least_one_client = threading.Event()
         super().__init__(*args, **kwargs, directory=FRONTEND_BASE_PATH)
 
     def log_message(self, fmt, *args):
@@ -77,69 +70,65 @@
     def _api_updates(self):
         """Handles a publish-only websocket connection that send show_object events along with their hashes and URLs"""
-        self.send_response(HTTPStatus.OK)
-        self.send_header("Content-Type", "text/event-stream")
-        self.send_header("Cache-Control", "no-cache")
-        # Chunked transfer encoding!
-        self.send_header("Transfer-Encoding", "chunked")
-        self.end_headers()
-        self.at_least_one_client.set()
-        logger.debug('Updates client connected')
-
-        def write_chunk(_chunk_data: str):
-            self.wfile.write(hex(len(_chunk_data))[2:].encode('utf-8'))
-            self.wfile.write(b'\r\n')
-            self.wfile.write(_chunk_data.encode('utf-8'))
-            self.wfile.write(b'\r\n')
-            self.wfile.flush()
-
-        write_chunk('retry: 100\n\n')
-
-        # Send buffered events first, while keeping a lock
-        with self.frontend_lock:
-            for data in self.yacv.show_events.buffer():
-                logger.debug('Sending info about %s: %s', data.name, data)
-                # noinspection PyUnresolvedReferences
-                to_send = data.to_json()
-                write_chunk(f'data: {to_send}\n\n')
-
-        # Send future events over the same connection
-        # Also send keep-alive to know if the client is still connected
-        subscription = self.yacv.show_events.subscribe(include_buffered=False)
-        it = TimeoutIterator(subscription, sentinel=None, reset_on_next=True, timeout=5.0)  # Keep-alive interval
-        try:
-            for data in it:
-                if data is None:
-                    write_chunk(':keep-alive\n\n')
-                else:
-                    logger.debug('Sending info about %s: %s', data.name, data)
-                    # noinspection PyUnresolvedReferences
-                    to_send = data.to_json()
-                    write_chunk(f'data: {to_send}\n\n')
-        except BrokenPipeError:  # Client disconnected normally
-            pass
-        finally:
-            logger.debug('Updates client disconnected')
-            try:
-                it.interrupt()
-                next(it)  # Make sure the iterator is interrupted before trying to close the subscription
-                subscription.close()
-            except BaseException as e:
-                logger.debug('Ignoring error while closing subscription: %s', e)
+        # Keep a shared read lock to know if any frontend is still working before shutting down
+        with self.yacv.frontend_lock.r_locked():
+            # Avoid accepting new connections while shutting down
+            if self.yacv.shutting_down.is_set() and not self.yacv.at_least_one_client.is_set():
+                self.send_error(HTTPStatus.SERVICE_UNAVAILABLE, 'Server is shutting down')
+                return
+            self.yacv.at_least_one_client.set()
+            logger.debug('Updates client connected')
+
+            self.send_response(HTTPStatus.OK)
+            self.send_header("Content-Type", "text/event-stream")
+            self.send_header("Cache-Control", "no-cache")
+            # Chunked transfer encoding!
+            self.send_header("Transfer-Encoding", "chunked")
+            self.end_headers()
+
+            def write_chunk(_chunk_data: str):
+                self.wfile.write(hex(len(_chunk_data))[2:].encode('utf-8'))
+                self.wfile.write(b'\r\n')
+                self.wfile.write(_chunk_data.encode('utf-8'))
+                self.wfile.write(b'\r\n')
+                self.wfile.flush()
+
+            write_chunk('retry: 100\n\n')
+
+            subscription = self.yacv.show_events.subscribe(yield_timeout=1.0)  # Keep-alive interval
+            try:
+                for data in subscription:
+                    if data is None:
+                        write_chunk(':keep-alive\n\n')
+                    else:
+                        logger.debug('Sending info about %s: %s', data.name, data)
+                        # noinspection PyUnresolvedReferences
+                        to_send = data.to_json()
+                        write_chunk(f'data: {to_send}\n\n')
+            except BrokenPipeError:  # Client disconnected normally
+                pass
+            finally:
+                subscription.close()
+
+            logger.debug('Updates client disconnected')
 
     def _api_object(self, obj_name: str):
         """Returns the object file with the matching name, building it if necessary."""
-        with self.frontend_lock:
-            # Export the object (or fail if not found)
-            exported_glb = self.yacv.export(obj_name)
-            if exported_glb is None:
-                self.send_error(HTTPStatus.NOT_FOUND, f'Object {obj_name} not found')
-                return io.BytesIO()
-            # Wrap the GLB in a response and return it
-            self.send_response(HTTPStatus.OK)
-            self.send_header('Content-Type', 'model/gltf-binary')
-            self.send_header('Content-Length', str(len(exported_glb)))
-            self.send_header('Content-Disposition', f'attachment; filename="{obj_name}.glb"')
-            self.end_headers()
-            self.wfile.write(exported_glb)
+        # Export the object (or fail if not found)
+        _export = self.yacv.export(obj_name)
+        if _export is None:
+            self.send_error(HTTPStatus.NOT_FOUND, f'Object {obj_name} not found')
+            return io.BytesIO()
+
+        exported_glb, _hash = _export
+
+        # Wrap the GLB in a response and return it
+        self.send_response(HTTPStatus.OK)
+        self.send_header('Content-Type', 'model/gltf-binary')
+        self.send_header('Content-Length', str(len(exported_glb)))
+        self.send_header('Content-Disposition', f'attachment; filename="{obj_name}.glb"')
+        self.send_header('E-Tag', f'"{_hash}"')
+        self.end_headers()
+        self.wfile.write(exported_glb)

View File

@@ -1,4 +1,4 @@
-import threading
+import queue
 import queue
 import threading
 from typing import List, TypeVar, \
@@ -8,6 +8,8 @@ from yacv_server.mylogger import logger
 T = TypeVar('T')
 
+_end_of_queue = object()
+
 
 class BufferedPubSub(Generic[T]):
     """A simple implementation of publish-subscribe pattern using threading and buffering all previous events"""
@@ -45,7 +47,7 @@
             for event in self._buffer:
                 q.put(event)
             if not include_future:
-                q.put(None)
+                q.put(_end_of_queue)
         return q
 
     def _unsubscribe(self, q: queue.Queue[T]):
@@ -54,14 +56,18 @@
             self._subscribers.remove(q)
             logger.debug(f"Unsubscribed from %s (%d subscribers)", self, len(self._subscribers))
 
-    def subscribe(self, include_buffered: bool = True, include_future: bool = True) -> Generator[T, None, None]:
+    def subscribe(self, include_buffered: bool = True, include_future: bool = True, yield_timeout: float = 0.0) -> \
+            Generator[T, None, None]:
         """Subscribes to events as an generator that yields events and automatically unsubscribes"""
         q = self._subscribe(include_buffered, include_future)
        try:
            while True:
-                v = q.get()
+                try:
+                    v = q.get(timeout=yield_timeout)
+                except queue.Empty:
+                    v = None
                # include_future is incompatible with None values as they are used to signal the end of the stream
-                if v is None and not include_future:
+                if v is _end_of_queue:
                    break
                yield v
        finally:  # When aclose() is called
@@ -80,4 +86,4 @@
     def clear(self):
         """Clears the buffer"""
         with self._buffer_lock:
             self._buffer.clear()
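
Aside (not part of the changeset): a minimal usage sketch of the new `subscribe(yield_timeout=...)` behaviour, mirroring how `_api_updates` in `myhttp.py` above consumes it. The `pubsub` and `write_chunk` names are placeholders for the BufferedPubSub instance and the SSE chunk writer.

# Minimal sketch, assuming the BufferedPubSub.subscribe() changes shown above:
# a None yield means yield_timeout expired with no event, which the consumer
# uses as a cue to emit a server-sent-events keep-alive comment line.
def stream_show_events(pubsub, write_chunk):
    subscription = pubsub.subscribe(yield_timeout=1.0)
    try:
        for event in subscription:
            if event is None:  # no event within yield_timeout: client liveness check
                write_chunk(':keep-alive\n\n')
            else:
                write_chunk(f'data: {event.to_json()}\n\n')
    finally:
        subscription.close()  # unsubscribes via the generator's finally block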

yacv_server/rwlock.py (new file)
View File

@@ -0,0 +1,96 @@
+# -*- coding: utf-8 -*-
+""" rwlock.py
+
+    A class to implement read-write locks on top of the standard threading
+    library.
+
+    This is implemented with two mutexes (threading.Lock instances) as per this
+    wikipedia pseudocode:
+
+    https://en.wikipedia.org/wiki/Readers%E2%80%93writer_lock#Using_two_mutexes
+
+    Code written by Tyler Neylon at Unbox Research.
+
+    This file is public domain.
+"""
+
+# _______________________________________________________________________
+# Imports
+
+from contextlib import contextmanager
+from threading import Lock
+
+# _______________________________________________________________________
+# Class
+
+class RWLock(object):
+    """ RWLock class; this is meant to allow an object to be read from by
+        multiple threads, but only written to by a single thread at a time. See:
+        https://en.wikipedia.org/wiki/Readers%E2%80%93writer_lock
+
+        Usage:
+
+            from rwlock import RWLock
+
+            my_obj_rwlock = RWLock()
+
+            # When reading from my_obj:
+            with my_obj_rwlock.r_locked():
+                do_read_only_things_with(my_obj)
+
+            # When writing to my_obj:
+            with my_obj_rwlock.w_locked():
+                mutate(my_obj)
+    """
+
+    def __init__(self):
+        self.w_lock = Lock()
+        self.num_r_lock = Lock()
+        self.num_r = 0
+
+    # ___________________________________________________________________
+    # Reading methods.
+
+    def r_acquire(self, *args, **kwargs):
+        self.num_r_lock.acquire(*args, **kwargs)
+        self.num_r += 1
+        if self.num_r == 1:
+            self.w_lock.acquire(*args, **kwargs)
+        self.num_r_lock.release()
+
+    def r_release(self, *args, **kwargs):
+        assert self.num_r > 0
+        self.num_r_lock.acquire(*args, **kwargs)
+        self.num_r -= 1
+        if self.num_r == 0:
+            self.w_lock.release()
+        self.num_r_lock.release()
+
+    @contextmanager
+    def r_locked(self, *args, **kwargs):
+        """ This method is designed to be used via the `with` statement. """
+        try:
+            self.r_acquire(*args, **kwargs)
+            yield
+        finally:
+            self.r_release()
+
+    # ___________________________________________________________________
+    # Writing methods.
+
+    def w_acquire(self, *args, **kwargs):
+        self.w_lock.acquire(*args, **kwargs)
+
+    def w_release(self):
+        self.w_lock.release()
+
+    @contextmanager
+    def w_locked(self, *args, **kwargs):
+        """ This method is designed to be used via the `with` statement. """
+        try:
+            self.w_acquire(*args, **kwargs)
+            yield
+        finally:
+            self.w_release()

View File

@@ -10,7 +10,7 @@ from dataclasses import dataclass
 from http.server import ThreadingHTTPServer
 from importlib.metadata import version
 from threading import Thread
-from typing import Optional, Dict, Union, Callable, List
+from typing import Optional, Dict, Union, Callable, List, Tuple
 
 from OCP.TopLoc import TopLoc_Location
 from OCP.TopoDS import TopoDS_Shape
@@ -18,8 +18,9 @@ from OCP.TopoDS import TopoDS_Shape
 from build123d import Shape, Axis, Location, Vector
 from dataclasses_json import dataclass_json
 
-from myhttp import HTTPHandler
+from rwlock import RWLock
 from yacv_server.cad import get_shape, grab_all_cad, CADCoreLike, CADLike
+from yacv_server.myhttp import HTTPHandler
 from yacv_server.mylogger import logger
 from yacv_server.pubsub import BufferedPubSub
 from yacv_server.tessellate import _hashcode, tessellate
@@ -33,8 +34,8 @@ class UpdatesApiData:
     """Name of the object. Should be unique unless you want to overwrite the previous object"""
     hash: str
     """Hash of the object, to detect changes without rebuilding the object"""
-    is_remove: bool
-    """Whether to remove the object from the scene"""
+    is_remove: Optional[bool]
+    """Whether to remove the object from the scene. If None, this is a shutdown request"""
 
 YACVSupported = Union[bytes, CADCoreLike]
@@ -46,7 +47,7 @@ class UpdatesApiFullData(UpdatesApiData):
     kwargs: Optional[Dict[str, any]]
     """The show_object options, if any (not serialized)"""
 
-    def __init__(self, obj: YACVSupported, name: str, _hash: str, is_remove: bool = False,
+    def __init__(self, obj: YACVSupported, name: str, _hash: str, is_remove: Optional[bool] = False,
                  kwargs: Optional[Dict[str, any]] = None):
         self.name = name
         self.hash = _hash
@@ -60,22 +61,42 @@
 class YACV:
+    """The main yacv_server class, which manages the web server and the CAD objects."""
+
+    # Startup
     server_thread: Optional[Thread]
+    """The main thread running the server (will spawn other threads for each request)"""
     server: Optional[ThreadingHTTPServer]
+    """The server object"""
     startup_complete: threading.Event
+    """Event to signal when the server has started"""
+
+    # Running
     show_events: BufferedPubSub[UpdatesApiFullData]
+    """PubSub for show events (objects to be shown in/removed from the scene)"""
     build_events: Dict[str, BufferedPubSub[bytes]]
-    object_events_lock: threading.Lock
+    """PubSub for build events (objects that were built)"""
+    build_events_lock: threading.Lock
+    """Lock to ensure that objects are only built once"""
+
+    # Shutdown
+    at_least_one_client: threading.Event
+    """Event to signal when at least one client has connected"""
+    shutting_down: threading.Event
+    """Event to signal when the server is shutting down"""
+    frontend_lock: RWLock
+    """Lock to ensure that the frontend has finished working before we shut down"""
 
     def __init__(self):
         self.server_thread = None
         self.server = None
         self.startup_complete = threading.Event()
-        self.at_least_one_client = threading.Event()
         self.show_events = BufferedPubSub()
         self.build_events = {}
-        self.object_events_lock = threading.Lock()
-        self.frontend_lock = threading.Lock()
+        self.build_events_lock = threading.Lock()
+        self.at_least_one_client = threading.Event()
+        self.shutting_down = threading.Event()
+        self.frontend_lock = RWLock()
         logger.info('Using yacv-server v%s', version('yacv-server'))
 
     def start(self):
@@ -100,38 +121,36 @@
             logger.error('Cannot stop server because it is not running')
             return
 
+        # Inform the server that we are shutting down
+        self.shutting_down.set()
+        # noinspection PyTypeChecker
+        self.show_events.publish(UpdatesApiFullData(name='__shutdown', _hash='', is_remove=None, obj=None))
+
+        # If we were too fast, ensure that at least one client has connected
         graceful_secs_connect = float(os.getenv('YACV_GRACEFUL_SECS_CONNECT', 12.0))
-        graceful_secs_request = float(os.getenv('YACV_GRACEFUL_SECS_REQUEST', 5.0))
-        # Make sure we can hold the lock for more than 100ms (to avoid exiting too early)
-        logger.info('Stopping server (waiting for at least one frontend request first, cancel with CTRL+C)...')
-        start = time.time()
-        try:
-            while not self.at_least_one_client.wait(
-                    graceful_secs_connect / 10) and time.time() - start < graceful_secs_connect:
-                time.sleep(0.01)
-        except KeyboardInterrupt:
-            pass
-
-        logger.info('Stopping server (waiting for no more frontend requests)...')
-        start = time.time()
-        try:
-            while time.time() - start < graceful_secs_request:
-                if self.frontend_lock.locked():
-                    start = time.time()
-                time.sleep(0.01)
-        except KeyboardInterrupt:
-            pass
-
-        # Stop the server in the background
-        self.server.shutdown()
-        logger.info('Stopping server (sent)...')
-
-        # Wait for the server to stop gracefully
+        if graceful_secs_connect > 0:
+            start = time.time()
+            try:
+                if not self.at_least_one_client.is_set():
+                    logger.warning(
+                        'Waiting for at least one frontend request before stopping server, cancel with CTRL+C...')
+                    while (not self.at_least_one_client.wait(graceful_secs_connect / 10) and
+                           time.time() - start < graceful_secs_connect):
+                        time.sleep(0.01)
+            except KeyboardInterrupt:
+                pass
+
+        # Wait for the server to stop gracefully (all frontends to stop working)
+        graceful_secs_request = float(os.getenv('YACV_GRACEFUL_SECS_WORK', 1000000))
+        with self.frontend_lock.w_locked(timeout=graceful_secs_request):
+            # Stop the server
+            self.server.shutdown()
+
+        # Wait for the server thread to stop
         self.server_thread.join(timeout=30)
         self.server_thread = None
-        logger.info('Stopping server (confirmed)...')
         if len(args) >= 1 and args[0] in (signal.SIGINT, signal.SIGTERM):
             sys.exit(0)  # Exit with success
 
     def _run_server(self):
         """Runs the web server"""
@@ -187,7 +206,7 @@
                 self.show_events.delete(old_show_event)
 
         # Delete any cached object builds
-        with self.object_events_lock:
+        with self.build_events_lock:
             if name in self.build_events:
                 del self.build_events[name]
@@ -206,13 +225,13 @@
     def shown_object_names(self, apply_removes: bool = True) -> List[str]:
         """Returns the names of all objects that have been shown"""
-        res = []
+        res = set()
         for obj in self.show_events.buffer():
             if not obj.is_remove or not apply_removes:
-                res.append(obj.name)
+                res.add(obj.name)
             else:
-                res.remove(obj.name)
-        return res
+                res.discard(obj.name)
+        return list(res)
 
     def _show_events(self, name: str, apply_removes: bool = True) -> List[UpdatesApiFullData]:
         """Returns the show events with the given name"""
@@ -228,8 +247,8 @@
                     res.remove(old_event)
         return res
 
-    def export(self, name: str) -> Optional[bytes]:
-        """Export the given previously-shown object to a single GLB file, building it if necessary."""
+    def export(self, name: str) -> Optional[Tuple[bytes, str]]:
+        """Export the given previously-shown object to a single GLB blob, building it if necessary."""
         start = time.time()
 
         # Check that the object to build exists and grab it if it does
@@ -240,7 +259,7 @@
         event = events[-1]
 
         # Use the lock to ensure that we don't build the object twice
-        with self.object_events_lock:
+        with self.build_events_lock:
             # If there are no object events for this name, we need to build the object
             if name not in self.build_events:
                 logger.debug('Building object %s with hash %s', name, event.hash)
@@ -266,7 +285,7 @@
         # In either case return the elements of a subscription to the async generator
         subscription = self.build_events[name].subscribe()
         try:
-            return next(subscription)
+            return next(subscription), event.hash
         finally:
            subscription.close()
@@ -277,7 +296,7 @@
        for name in self.shown_object_names():
            if export_filter(name, self._show_events(name)[-1].obj):
                with open(os.path.join(folder, f'{name}.glb'), 'wb') as f:
-                    f.write(self.export(name))
+                    f.write(self.export(name)[0])

 # noinspection PyUnusedLocal
@@ -302,10 +321,10 @@ def _preprocess_cad(obj: CADLike, **kwargs) -> CADCoreLike:
 _find_var_name_count = 0

-def _find_var_name(obj: any) -> str:
+def _find_var_name(obj: any, avoid_levels: int = 2) -> str:
     """A hacky way to get a stable name for an object that may change over time"""
     global _find_var_name_count
-    for frame in inspect.stack():
+    for frame in inspect.stack()[avoid_levels:]:
         for key, value in frame.frame.f_locals.items():
             if value is obj:
                 return key

View File

@@ -390,28 +390,28 @@
   resolved "https://registry.yarnpkg.com/@esbuild/win32-x64/-/win32-x64-0.19.12.tgz#c57c8afbb4054a3ab8317591a0b7320360b444ae"
   integrity sha512-T1QyPSDCyMXaO3pzBkF96E8xMkiRYbUEZADd29SyPGabqxMViNoii+NcK7eWJAEoU6RZyEm5lVSIjTmcdoB9HA==
 
-"@gltf-transform/core@^3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@gltf-transform/core/-/core-3.10.0.tgz#854e7345f23971e4e7367a29183a2d1b62d45e46"
-  integrity sha512-NxVKhSWvH0j1tjZE8Yl461HUMyZLmYmqcbqHw0TOcQd5Q1SV7Y5w6W68XMt9/amRfMAiJLLNREE7kbr+Z0Ydbw==
+"@gltf-transform/core@^3.10.0", "@gltf-transform/core@^3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@gltf-transform/core/-/core-3.10.1.tgz#d99c060b499482ed2c3304466405bf4c10939831"
+  integrity sha512-50OYemknGNxjBmiOM6iJp04JAu0bl9jvXJfN/gFt9QdJO02cPDcoXlTfSPJG6TVWDcfl0xPlsx1vybcbPVGFcQ==
   dependencies:
     property-graph "^1.3.1"
 
-"@gltf-transform/extensions@^3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@gltf-transform/extensions/-/extensions-3.10.0.tgz#4ae11c3fe8e2a77e6e9dd04ebf0931c7b0cd3690"
-  integrity sha512-dz/cf2toBzP+w3ES2VgMiINCN6q86MVGu1lHkT0El4No77Bje9fnHVEPrKwaDCsXi5YXUiG/u6686vK6jePwDA==
+"@gltf-transform/extensions@^3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@gltf-transform/extensions/-/extensions-3.10.1.tgz#71664389cae46fb12eb97dc71eb96d86a0d7801f"
+  integrity sha512-xUS9K5fMvW2dkYN4VzxHg2aBPG54M2WqgIjQ7RoSyybMoD7DsPUyMyVgRja+aiTVt/Bxza2ve7zJBD3+tN+aTA==
   dependencies:
-    "@gltf-transform/core" "^3.10.0"
+    "@gltf-transform/core" "^3.10.1"
     ktx-parse "^0.6.0"
 
-"@gltf-transform/functions@^3.10.0":
-  version "3.10.0"
-  resolved "https://registry.yarnpkg.com/@gltf-transform/functions/-/functions-3.10.0.tgz#bf0331c109ac948d19be7394d3afcfae84215cfd"
-  integrity sha512-FStbDaH7t2z74RyEeUQn3aBcybULbDkt72ZasC0s7DwQ2DFKKKOth4Zksi4g9+8URNM6vNa2JSfuO851dkJHEg==
+"@gltf-transform/functions@^3.10.1":
+  version "3.10.1"
+  resolved "https://registry.yarnpkg.com/@gltf-transform/functions/-/functions-3.10.1.tgz#c40817740241c0ee770f4d1210ccc766e46d8ab2"
+  integrity sha512-Zs6+1qvTD9w40R5qv70E4wJXXacNQ46ZxjKKW6dmfGIyjT8bsSJmV3Tdj+WJ8R6lWXXZ8e2p3ZvAUfPDEG73bQ==
   dependencies:
-    "@gltf-transform/core" "^3.10.0"
-    "@gltf-transform/extensions" "^3.10.0"
+    "@gltf-transform/core" "^3.10.1"
+    "@gltf-transform/extensions" "^3.10.1"
     ktx-parse "^0.6.0"
     ndarray "^1.0.19"
     ndarray-lanczos "^0.3.0"
@@ -805,10 +805,10 @@
   resolved "https://registry.yarnpkg.com/@types/ndarray/-/ndarray-1.0.14.tgz#96b28c09a3587a76de380243f87bb7a2d63b4b23"
   integrity sha512-oANmFZMnFQvb219SSBIhI1Ih/r4CvHDOzkWyJS/XRqkMrGH5/kaPSA1hQhdIBzouaE+5KpE/f5ylI9cujmckQg==
 
-"@types/node@^20.11.25":
-  version "20.11.25"
-  resolved "https://registry.yarnpkg.com/@types/node/-/node-20.11.25.tgz#0f50d62f274e54dd7a49f7704cc16bfbcccaf49f"
-  integrity sha512-TBHyJxk2b7HceLVGFcpAUjsa5zIdsPWlR6XHfyGzd0SFu+/NFgQgMAl96MSDZgQDvJAvV6BKsFOrt6zIL09JDw==
+"@types/node@^20.11.28":
+  version "20.11.28"
+  resolved "https://registry.yarnpkg.com/@types/node/-/node-20.11.28.tgz#4fd5b2daff2e580c12316e457473d68f15ee6f66"
+  integrity sha512-M/GPWVS2wLkSkNHVeLkrF2fD5Lx5UC4PxA0uZcKc6QqbIQUJyW1jVjueJYi1z8n0I5PxYrtpnPnWglE+y9A0KA==
   dependencies:
     undici-types "~5.26.4"
@@ -2878,10 +2878,10 @@ tar@^6.1.11, tar@^6.1.2:
     mkdirp "^1.0.3"
     yallist "^4.0.0"
 
-terser@^5.29.1:
-  version "5.29.1"
-  resolved "https://registry.yarnpkg.com/terser/-/terser-5.29.1.tgz#44e58045b70c09792ba14bfb7b4e14ca8755b9fa"
-  integrity sha512-lZQ/fyaIGxsbGxApKmoPTODIzELy3++mXhS5hOqaAWZjQtpq/hFHAc+rm29NND1rYRxRWKcjuARNwULNXa5RtQ==
+terser@^5.29.2:
+  version "5.29.2"
+  resolved "https://registry.yarnpkg.com/terser/-/terser-5.29.2.tgz#c17d573ce1da1b30f21a877bffd5655dd86fdb35"
+  integrity sha512-ZiGkhUBIM+7LwkNjXYJq8svgkd+QK3UUr0wJqY4MieaezBSAIPgbSPZyIx0idM6XWK5CMzSWa8MJIzmRcB8Caw==
   dependencies:
     "@jridgewell/source-map" "^0.3.3"
     acorn "^8.8.2"
@@ -2992,10 +2992,10 @@ validate-npm-package-name@^5.0.0:
   dependencies:
     builtins "^5.0.0"
 
-vite@^5.1.5:
-  version "5.1.5"
-  resolved "https://registry.yarnpkg.com/vite/-/vite-5.1.5.tgz#bdbc2b15e8000d9cc5172f059201178f9c9de5fb"
-  integrity sha512-BdN1xh0Of/oQafhU+FvopafUp6WaYenLU/NFoL5WyJL++GxkNfieKzBhM24H3HVsPQrlAqB7iJYTHabzaRed5Q==
+vite@^5.1.6:
+  version "5.1.6"
+  resolved "https://registry.yarnpkg.com/vite/-/vite-5.1.6.tgz#706dae5fab9e97f57578469eef1405fc483943e4"
+  integrity sha512-yYIAZs9nVfRJ/AiOLCA91zzhjsHUgMjB+EigzFb6W2XTLO8JixBCKCjvhKZaye+NKYHCrkv3Oh50dH9EdLU2RA==
   dependencies:
     esbuild "^0.19.3"
     postcss "^8.4.35"
@@ -3031,10 +3031,10 @@ vue@^3.4.21:
     "@vue/server-renderer" "3.4.21"
     "@vue/shared" "3.4.21"
 
-vuetify@^3.5.8:
-  version "3.5.8"
-  resolved "https://registry.yarnpkg.com/vuetify/-/vuetify-3.5.8.tgz#bc8f08dfd3314640e7b5d43b50138a26d650cbbf"
-  integrity sha512-8nGS+lKejZkev55HFwIfsRt+9fOqbeDQNmXxfmLKAlnUT8FtynVwbjAwHMtX/OQAQ3ZwRaR1ptqQQmx3OgxzbQ==
+vuetify@^3.5.9:
+  version "3.5.9"
+  resolved "https://registry.yarnpkg.com/vuetify/-/vuetify-3.5.9.tgz#9cb3554f4b9bb7f3c277a10b99ab5944e6d04b03"
+  integrity sha512-tA3N2uWZFNSZRFNnXN841x4rWozYXKC0fGW/mJIwcKkQiI0+gmVCETtjF8bnOS7L1s0buWzw94uYTlXQa5AQ4w==
 
 walk-up-path@^3.0.1:
   version "3.0.1"