DigitalX is an ASX-listed digital asset manager offering funds and a spot Bitcoin ETF. We partnered with them to design and ship an investment analytics product that consolidates fragmented data into one place for research work.
Objectives & outcomes
DigitalX engaged us to deliver a production-grade platform for fund operations: consolidate on-chain and digital asset data; standardise metrics and calculations; auto-generate recurring factsheets and dashboards; and enable safe AI-assisted Q&A across digital assets, economic data and semantic research notes.
Key outcomes targeted:
- A governed data flow connecting digital asset data, investment research and public datasets.
- AI search functionality, live dashboards, and ad-hoc report building capabilities.
Context, constraints & success criteria
The platform needed to serve investment analysts with different goals, areas of focus and depths of research. Datasets ranged from vendor API feeds to on-chain transactions and internal pricing and risk criteria.
Because the digital asset environment shifts quickly, the solution had to support swappable models, new data vendors and evolving compliance language without rewiring pipelines. We validated approaches across trained models, retrieval-augmented generation (RAG) and structured prompting, using an evaluation harness that tracked precision, grounding and latency.
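The evaluation harness above can be sketched in a few lines. This is a minimal illustration, not the production harness: the `pipeline` callable, its `(facts, citation_flags)` return shape, and the scoring rules are all assumptions made for the example.

```python
import time
from dataclasses import dataclass


@dataclass
class EvalResult:
    precision: float   # fraction of returned facts that match the gold answer
    grounding: float   # fraction of answer facts backed by a retrieved source
    latency_s: float   # wall-clock time for one pipeline call


def evaluate(pipeline, cases):
    """Run (question, gold_facts) cases through a candidate pipeline
    (trained model, RAG, or structured prompting) and score each run,
    so approaches can be compared on the same metrics."""
    results = []
    for question, gold_facts in cases:
        start = time.perf_counter()
        answer_facts, cited = pipeline(question)  # hypothetical return shape
        latency = time.perf_counter() - start
        hits = sum(1 for fact in answer_facts if fact in gold_facts)
        precision = hits / len(answer_facts) if answer_facts else 0.0
        grounding = sum(cited) / len(cited) if cited else 0.0
        results.append(EvalResult(precision, grounding, latency))
    return results
```

Swapping in a different vendor or model then only means passing a different `pipeline` callable; the metrics stay comparable across candidates.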
Approach
Engagement model
Discovery with portfolio teams and investment analysts
We mapped the flow from digital asset data types to investment research reports end-to-end and documented the manual processes analysts performed. This showed where analyst time was best spent and which tasks could be abstracted away.
Build & phases
Four coordinated tracks delivered analytics quickly for the highest-priority work while laying a scalable foundation.
Discovery
We produced a report covering the practical benefits of analytics, data integration, AI and automation for DigitalX's investment research use cases, and estimated the ROI and cost savings from automating manual data collection tasks.
- Catalogued sources (on-chain, market, registry, research).
- Defined evaluation criteria (accuracy, freshness, explainability, latency, cost).
- Baselined the current state: lineage, data quality, reporting gaps and duplication.
Design
Target architecture, standards and operating model for multi-product reporting.
- Cloud data platform; event/object schemas; models & tests; AI layer.
- Governance: RBAC, cost guardrails.
- AI patterns: RAG, anomaly detection, templated narratives.
Delivery
Building for the high priority work, then scaling out to additional datasets and use cases.
- Priority datasets ingested and modelled; quality tests and lineage in place.
- Publish governed metrics; generate baseline factsheets and investor updates.
- Implement AI Q&A with citations.
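The citation-carrying Q&A in the last step can be sketched as a small RAG loop. This is an illustrative stand-in, not the production system: the keyword-overlap `retrieve` function (a real system would use embeddings), the corpus shape and the source IDs are all assumptions.

```python
from dataclasses import dataclass


@dataclass
class Passage:
    source_id: str  # identifier used for source-level citations
    text: str


def retrieve(question, corpus, k=3):
    """Naive keyword-overlap retrieval; embeddings would replace this
    in a real deployment."""
    terms = set(question.lower().split())
    scored = sorted(
        corpus,
        key=lambda p: -len(terms & set(p.text.lower().split())),
    )
    return scored[:k]


def answer_with_citations(question, corpus):
    """Build the grounded context an LLM would answer from, and return
    the source IDs alongside it so every answer carries citations."""
    passages = retrieve(question, corpus)
    context = "\n".join(f"[{p.source_id}] {p.text}" for p in passages)
    # An LLM call would go here, prompted to answer only from `context`
    # and to quote the bracketed source IDs.
    citations = [p.source_id for p in passages]
    return context, citations
```

Keeping `source_id` attached to each passage from retrieval through generation is what makes every answer traceable back to a research note or dataset.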
Iteration
Scale to additional datasets and use cases.
- Idempotent ingestion; retries; backfilling; change-data-capture where available.
- Evaluation/backtesting; red-team prompts; prompt-library versioning.
- Documentation for transparency and auditability.
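The idempotent-ingestion-with-retries pattern from the first bullet can be sketched with a keyed upsert. SQLite stands in for the cloud warehouse here, and the `prices(asset, ts, price)` table is invented for the example; the point is that re-running a batch never duplicates rows, so retries and backfills are safe.

```python
import sqlite3
import time


def upsert_rows(conn, rows, retries=3):
    """Idempotent load: the (asset, ts) key means re-running the same
    batch leaves exactly one row per key, so retries and backfills
    never create duplicates."""
    for attempt in range(retries):
        try:
            conn.executemany(
                "INSERT INTO prices(asset, ts, price) VALUES (?, ?, ?) "
                "ON CONFLICT(asset, ts) DO UPDATE SET price = excluded.price",
                rows,
            )
            conn.commit()
            return
        except sqlite3.OperationalError:
            time.sleep(2 ** attempt)  # exponential backoff between retries
    raise RuntimeError("ingestion failed after retries")
```

Because the load is keyed rather than append-only, a backfill is just another call to `upsert_rows` over a historical date range.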
Results
- Faster, broader asset selection and screening.
- Dashboards for key metrics and trends.
- AI-assisted Q&A for semantic discovery with source-level citations.
- Less manual collation and fewer one-off spreadsheets across research teams.
Solution overview
Data platform. Cloud warehouse as the analytical system of record; ingestion; models with automated tests; lineage capture; freshness and cost monitors. A governed AI layer exposes key metrics for search.
AI-enhanced analysis. Templated research dashboards display standardised key metrics and trends across different data sources.
Custom web app to house the analytics and AI. Role-based access control (RBAC) ensures each user sees only the data and actions appropriate to their role. Built with Next.js, the dashboard is fast, secure, and includes custom chatbots for guided analysis.
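The production app enforces RBAC in Next.js, but the least-privilege check itself is language-agnostic, so a minimal sketch follows in Python. The role and permission names are illustrative only, not the real role model.

```python
# Minimal deny-by-default RBAC sketch; roles and permissions are
# illustrative, not the production role model.
ROLE_PERMISSIONS = {
    "analyst": {"view_dashboards", "run_queries"},
    "portfolio_manager": {"view_dashboards", "run_queries", "export_reports"},
    "admin": {"view_dashboards", "run_queries", "export_reports", "manage_users"},
}


def can(role, action):
    """Unknown roles or actions get no access (least privilege)."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

Checking `can(role, action)` at every route and data query, rather than hiding UI elements, is what makes the access model least-privilege rather than cosmetic.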
Notable benefits
What this means for investment & operations teams:
- Reliable numbers: consistent definitions with tests and lineage embedded.
- Faster analysis: analysts evaluate hypotheses instead of wrangling data.
- Explainable AI: every AI-assisted answer includes citations and guardrails.
- Elastic scale: new datasets and vendors plug into standards easily.
- Security by design: RBAC ensures least-privilege access.
- Lower total cost: automating tasks reduces manual hours spent on data collection.
AI is an accelerant, not a substitute. In this context it helps domain experts see more, sooner.
The Kali Software team



