Verification Guide
This document explains how to independently verify the claims made in this repository using standard system tools. Every step described here is non-destructive and legal, and requires nothing beyond a licensed installation of Beeble Studio on Linux.
Prerequisites
- A Linux system with Beeble Studio installed (RPM distribution)
- Basic familiarity with the terminal
- No special tools required beyond what ships with most Linux distributions
Method 1: String search on the binary
The strings command extracts human-readable text from any binary
file. It ships with most Linux distributions as part of GNU binutils.
# Extract all readable strings from the beeble-ai binary
strings /path/to/beeble-ai | grep -i "transparent.background"
strings /path/to/beeble-ai | grep -i "inspyrenet"
strings /path/to/beeble-ai | grep -i "depth.anything"
strings /path/to/beeble-ai | grep -i "dinov2"
strings /path/to/beeble-ai | grep -i "segmentation_models"
strings /path/to/beeble-ai | grep -i "kornia"
strings /path/to/beeble-ai | grep -i "HighPerfGpuNet"
strings /path/to/beeble-ai | grep -i "rt_detr\|rtdetr"
strings /path/to/beeble-ai | grep -i "boxmot"
strings /path/to/beeble-ai | grep -i "face_detection"
strings /path/to/beeble-ai | grep -i "dexined"
strings /path/to/beeble-ai | grep -i "rrdbnet\|super_resolution"
If the open-source models identified in this analysis are present, you will see matching strings: library names, docstrings, model checkpoint references, and configuration data.
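The individual searches above can be wrapped in a small loop so the whole keyword list is checked in one pass. A minimal sketch, assuming strings and grep are available; the binary path in the usage comment is a placeholder:

```shell
# Report which of a list of markers appear in a binary.
# Usage: scan_markers <binary> <keyword>...
scan_markers() {
  local bin="$1"; shift
  local kw
  for kw in "$@"; do
    if strings "$bin" | grep -qi -- "$kw"; then
      echo "FOUND:  $kw"
    else
      echo "absent: $kw"
    fi
  done
}

# Example (substitute your actual install path):
# scan_markers /path/to/beeble-ai inspyrenet dinov2 kornia boxmot
```

Because strings works on any file, the same helper can be pointed at the setup binary or any bundled library.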
You can also search for Python package paths:
strings /path/to/beeble-ai | grep "\.py" | grep -i "timm\|kornia\|albumentations"
To verify the architecture findings, search for TensorRT plugin names and quantized backbone patterns:
strings /path/to/beeble-ai | grep "_TRT"
strings /path/to/beeble-ai | grep "int8_resnet"
strings /path/to/beeble-ai | grep -i "encoder_name\|decoder_channels"
To verify the absence of physics-based rendering terminology:
# These searches should return no results
strings /path/to/beeble-ai | grep -i "cook.torrance\|brdf\|albedo"
strings /path/to/beeble-ai | grep -i "specular_net\|normal_net\|render_net"
strings /path/to/beeble-ai | grep -i "switchlight"
Method 2: Process memory inspection
When the application is running, you can see what shared libraries and files it has loaded.
# Find the beeble process ID
pgrep -f beeble-ai
# List all memory-mapped files
awk '$6 && $6 !~ /^\[/ {print $6}' /proc/<PID>/maps | sort -u
# Or use lsof to see open files
lsof -p <PID> | grep -i "model\|\.so\|python"
This shows which shared libraries (.so files) are loaded and which files the application has open. Look for references to Python libraries, CUDA/TensorRT files, and model data.
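To triage that list, a small filter can bucket the mapped files by what they suggest. A sketch; the library-name patterns (libnvinfer, libcudnn, libcublas) are assumptions about typical CUDA/TensorRT file names, not confirmed contents:

```shell
# Bucket mapped file paths: GPU inference stack, Python runtime, other .so files.
classify_libs() {
  awk '
    /libnvinfer|libcudnn|libcublas/ { print "gpu-stack: " $0; next }
    /libpython/                     { print "python:    " $0; next }
    /\.so/                          { print "other-so:  " $0 }
  '
}

# Usage (PID is a placeholder):
# awk "{print \$6}" /proc/<PID>/maps | sort -u | classify_libs
```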
For deeper analysis, you can search the process's memory. This requires ptrace permission (run as root, or as the process owner with kernel.yama.ptrace_scope set to 0). Running strings directly on /proc/<PID>/mem fails on unmapped regions, so dump a core image first with gcore (shipped with gdb):
# Dump a core image of the running process, then search it
gcore -o /tmp/beeble <PID>
strings /tmp/beeble.<PID> | grep -i "HighPerfGpuNet"
strings /tmp/beeble.<PID> | grep -i "encoder_name"
Method 3: RPM package contents
If Beeble Studio was installed via RPM, you can inspect the package contents without running the application.
# List all files installed by the package
rpm -ql beeble-studio
# Or if you have the RPM file
rpm -qlp beeble-studio-*.rpm
This reveals the directory structure, installed Python libraries, and bundled model files.
Method 4: Manifest inspection
The application downloads a manifest file during setup that lists the models it uses. If you have a copy of this file (it is downloaded to the application's data directory during normal operation), you can inspect it directly:
# Pretty-print and search the manifest
python3 -m json.tool manifest.json | grep -i "model\|name\|type"
A copy of this manifest is included in this repository at
evidence/manifest.json.
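If the manifest's exact schema is unknown, a schema-agnostic walk avoids guessing key paths. A sketch using a Python heredoc; the only assumption is that model entries carry a "name" field:

```shell
# Print every "name" value found anywhere in a JSON document.
# Usage: list_names <manifest.json>
list_names() {
  python3 - "$1" <<'EOF'
import json, sys

def names(node):
    # Recursively walk dicts and lists, yielding any "name" field.
    if isinstance(node, dict):
        if "name" in node:
            yield node["name"]
        for v in node.values():
            yield from names(v)
    elif isinstance(node, list):
        for v in node:
            yield from names(v)

with open(sys.argv[1]) as f:
    for n in names(json.load(f)):
        print(n)
EOF
}

# list_names manifest.json
```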
Method 5: Library directory inspection
The application's lib/ directory contains all bundled Python
packages. You can inventory them directly:
# List all top-level packages
ls /path/to/beeble-studio/lib/
# Check for license files
find /path/to/beeble-studio/lib/ -name "LICENSE*" -o -name "COPYING*"
# Check specific library versions
ls /path/to/beeble-studio/lib/ | grep -i "torch\|timm\|kornia"
# Count packages with and without license files
total=$(ls -d /path/to/beeble-studio/lib/*/ | wc -l)
licensed=$(find /path/to/beeble-studio/lib/ -maxdepth 2 \
  \( -name "LICENSE*" -o -name "COPYING*" \) | wc -l)
echo "$licensed of $total packages have license files"
Method 6: Electron app inspection
Beeble Studio's desktop UI is an Electron application. The compiled
JavaScript source is bundled in the application's resources/
directory and is not obfuscated. You can extract and read it:
# Find the app's asar archive or dist directory
find /path/to/beeble-studio/ -name "*.js" -path "*/dist/*"
# Search for CLI flag construction
grep -r "run-pbr\|run-alpha\|run-depth\|pbr-stride" /path/to/dist/
# Search for output channel definitions
grep -r "basecolor\|normal\|roughness\|specular\|metallic" /path/to/dist/
This reveals how the UI passes arguments to the engine binary, confirming that alpha, depth, and PBR are independent processing stages.
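The grep searches above can be turned into a quick tally so heavily used flags stand out. A sketch; the flag names come from the searches above, and the dist directory path is a placeholder:

```shell
# Count occurrences of each CLI flag across the bundled JS.
# Usage: flag_counts <dist-dir> <flag>...
flag_counts() {
  local dist="$1"; shift
  local flag n
  for flag in "$@"; do
    # -F: fixed string; -o: one line per match, so wc -l counts matches.
    n=$(grep -r -o -F -- "$flag" "$dist" 2>/dev/null | wc -l)
    echo "$flag: $n"
  done
}

# flag_counts /path/to/dist run-pbr run-alpha run-depth pbr-stride
```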
Method 7: PyInstaller module listing
The beeble-ai binary is a PyInstaller-packaged Python application.
You can list the bundled Python modules without executing the binary:
# PyInstaller archives have a table of contents that lists
# all bundled modules. Several open-source tools can extract this.
# Look for the pyarmor runtime and obfuscated module names
strings /path/to/beeble-ai | grep "pyarmor_runtime"
strings /path/to/beeble-ai | grep -E "^[a-z0-9]{8}\." | head -20
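The second search above can be reduced to a single number. A sketch of a helper; the eight-character prefix pattern is the Pyarmor-style obfuscated module naming targeted by the grep above:

```shell
# Count distinct obfuscated module-name prefixes in a binary's strings.
# Usage: obf_module_count <binary>
obf_module_count() {
  strings "$1" | grep -E '^[a-z0-9]{8}\.' | cut -d. -f1 | sort -u | wc -l
}

# obf_module_count /path/to/beeble-ai
```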
What to look for
When running these commands, look for:
- Library docstrings: Complete API documentation strings from open-source packages (e.g., the transparent-background library's output type options: 'rgba', 'green', 'white', 'blur', 'overlay')
- Model checkpoint names: References to specific pretrained model files (e.g., dinov2_vits14_pretrain.pth, depth-anything-v2-large)
- Package URLs: GitHub repository URLs for open-source projects
- Python import paths: Module paths like timm.models, kornia.onnx, segmentation_models_pytorch.decoders
- Runtime warnings: Messages from loaded libraries (e.g., WARNING:dinov2:xFormers not available)
- TensorRT plugin names: Custom plugins like DisentangledAttention_TRT and RnRes2FullFusion_TRT that identify specific architectures
- Quantization patterns: Strings like int8_resnet50_stage_2_fusion that reveal which backbones are compiled for inference
- Detection pipeline modules: Full module paths for the detection/tracking pipeline (e.g., kornia.contrib.models.rt_detr, kornia.models.tracking.boxmot_tracker, kornia.contrib.face_detection)
- Absent terminology: The absence of physics-based rendering terms (Cook-Torrance, BRDF, spherical harmonics) throughout the binary is itself a finding, given that the CVPR paper describes such an architecture
- Absent branding: The term "SwitchLight" does not appear anywhere in the binary, the setup binary, or the Electron app source code; this can be verified by searching all three codebases
- License file gap: Counting LICENSE files in the lib/ directory versus total packages reveals the scope of missing attribution
What not to do
This guide is limited to observation. The following activities are unnecessary for verification and may violate Beeble's terms of service:
- Do not attempt to decrypt the .enc model files
- Do not decompile or disassemble the beeble-ai binary
- Do not bypass or circumvent the Pyarmor code protection
- Do not intercept network traffic between the app and Beeble's servers
- Do not extract or redistribute any model files
The methods described above are sufficient to confirm which open-source components are present and what architectural patterns they suggest. There is no need to go further than that.