Features
All 16 modules — from photo ID to AR species overlay — with version targets.
# All Modules
- Multi-modal species identification using PlantNet, BirdNET, and Claude Vision ensemble
- Geotagged observation log with habitat, weather, and field notes
- Auto-extract EXIF, XMP, and ID3 metadata from photo, audio, and video files
- Lunar cycle, solar/seasonal, precipitation, phenological auto-enrichment
- Community validation with taxonomic badges, dispute resolution, and 2/3 consensus
- Full offline mode with cached ONNX models, IndexedDB outbox, Darwin Core export
- In-app crop, denoise, background removal — deprioritized per strategic review
- Profiles, badges, levels, streaks, BioBlitz events, challenges — no dark patterns
- Transect creation, waypoint observations, diversity metrics per trail segment
- Territorial Information Points with QR/NFC anchors linking place to living species data
- Shannon-Wiener, Simpson, Chao1, Pielou J, rarefaction curves, community matrices
- Bulk upload, motion detection, SpeciesNet AI, individual animal identification
- GBIF publisher, iNaturalist bridge, CONABIO SNIB, CONANP ANP, INAH biocultural
- Community-trained Neotropical models with federated learning, Oaxaca-endemic focus
- Conversational AI field assistant with RAG over regional species data
- Augmented reality species overlay in camera viewfinder
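The diversity metrics named in the list above are standard ecological indices. A minimal sketch of how they could be computed from per-species abundance counts on one transect segment follows; the function names are illustrative, not the app's actual API.

```typescript
// counts: abundance of each species observed on a transect segment.

// Shannon-Wiener index H' = -sum(p_i * ln p_i)
function shannon(counts: number[]): number {
  const n = counts.reduce((a, b) => a + b, 0);
  return -counts
    .filter(c => c > 0)
    .reduce((h, c) => h + (c / n) * Math.log(c / n), 0);
}

// Simpson diversity, reported as 1 - D where D = sum(p_i^2)
function simpson(counts: number[]): number {
  const n = counts.reduce((a, b) => a + b, 0);
  return 1 - counts.reduce((d, c) => d + (c / n) ** 2, 0);
}

// Pielou evenness J = H' / ln(S): observed H' scaled by its maximum
function pielouJ(counts: number[]): number {
  const s = counts.filter(c => c > 0).length;
  return shannon(counts) / Math.log(s);
}

// Chao1 richness estimator from singleton (F1) and doubleton (F2) counts;
// falls back to the bias-corrected form when F2 = 0.
function chao1(counts: number[]): number {
  const sObs = counts.filter(c => c > 0).length;
  const f1 = counts.filter(c => c === 1).length;
  const f2 = counts.filter(c => c === 2).length;
  return f2 > 0 ? sObs + (f1 * f1) / (2 * f2) : sObs + (f1 * (f1 - 1)) / 2;
}
```

For a perfectly even community (all counts equal), `pielouJ` returns 1; Chao1 adds an estimate of unseen species on top of the observed richness whenever rare species (singletons) are present.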
# Core AI Pipeline
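The identification module listed above ensembles PlantNet, BirdNET, and Claude Vision. One way such an ensemble can be combined is a weighted sum of per-model confidence scores; the sketch below illustrates that idea only, and the weighting scheme and all names in it are assumptions rather than the app's actual pipeline.

```typescript
type Candidate = { taxon: string; score: number };              // one model's ranked guess
type ModelResult = { model: string; candidates: Candidate[] };  // one model's output

// Merge per-model confidence scores using fixed per-model weights,
// then return taxa ranked by combined score (highest first).
function combine(
  results: ModelResult[],
  weights: Record<string, number>,
): Candidate[] {
  const totals = new Map<string, number>();
  for (const { model, candidates } of results) {
    const w = weights[model] ?? 1; // unknown models get weight 1
    for (const { taxon, score } of candidates) {
      totals.set(taxon, (totals.get(taxon) ?? 0) + w * score);
    }
  }
  return [...totals.entries()]
    .map(([taxon, score]) => ({ taxon, score }))
    .sort((a, b) => b.score - a.score);
}
```

A weighted sum degrades gracefully when one model is unavailable offline: models that return no candidates simply contribute nothing to the totals.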
# Offline-First Architecture
Every observation is written first to a local Dexie IndexedDB outbox. When connectivity returns, the queue flushes to Supabase via REST POST, so no data is lost while the device is offline. Regional ONNX model packs are also cached in IndexedDB for on-device inference via WebAssembly.
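The flush step of this outbox pattern can be sketched as a small loop over pending rows. The `OutboxStore` interface, `flushOutbox` name, and injected `post` callback below are illustrative assumptions; in the real app, Dexie would back the store and `post` would wrap a `fetch` POST to the Supabase REST endpoint.

```typescript
interface Observation {
  id: number;
  payload: unknown;
  synced: boolean;
}

// Minimal queue interface; in the app this would be a Dexie table.
interface OutboxStore {
  pending(): Promise<Observation[]>;   // unsynced rows, oldest first
  markSynced(id: number): Promise<void>;
}

// POST each queued observation in order. Stop at the first failure so
// ordering is preserved and the remainder is retried on the next reconnect.
async function flushOutbox(
  store: OutboxStore,
  post: (payload: unknown) => Promise<boolean>, // true on successful upload
): Promise<number> {
  let flushed = 0;
  for (const obs of await store.pending()) {
    if (!(await post(obs.payload))) break; // offline again: leave the rest queued
    await store.markSynced(obs.id);
    flushed++;
  }
  return flushed;
}
```

Marking each row synced only after a successful POST means a crash mid-flush re-sends at most the in-flight observation, which a unique client-side ID on the server can deduplicate.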