Hariprasad Bantwal

Principal Architect

Engineering Leader

Multi-Cloud, Data & AI

Capital Markets & Treasury Settlements · Payments · FIN Messaging · Data Centers · SaaS

20+ years delivering enterprise platforms, cloud‑native architectures, and AI‑integrated solutions for global financial institutions. Multi‑cloud orchestration across Azure, AWS, GCP, and OCI. TOGAF‑certified, hands‑on, outcome‑driven. Leading engineering teams to deliver scalable, secure, and compliant solutions that drive business transformation.

CHF 10B+ daily processing · $4T+ AUM platforms · Zero‑downtime migrations · Azure · AWS · OCI · Data & AI · Treasury & Capital Markets · Payments · DLT

Snapshot

Location
Zürich, Switzerland
LinkedIn
/in/paihari
GitHub
/paihari

Signature Projects

Challenge → Solution → Impact, with artifacts and stack.

2021–2024 · SaaS Core Banking

SaaS Core Banking Suite Cloud Transformation

Goal

Transform the SaaS Core Banking Suite architecture to leverage cloud-native capabilities, optimize workload placement across hybrid and multi-cloud environments, and deliver a standardized, automated, and compliant provisioning framework for 25+ banking clients.

Backstory

The existing SaaS platform was hosted entirely on a private cloud, requiring the integration of complex stacks — from servers to switches, routers, operating systems, and databases — in collaboration with multiple third-party providers.

The scale was immense: 1,700 databases, 30,000 virtual servers, and 5,000 physical servers. The monolithic Oracle-based Core Banking Suite spanned multiple banking domains, with business logic deeply embedded and tightly coupled to the data layer. Several attempts to re-platform the data stack to cloud DBaaS had failed due to technical dependencies and economic constraints.

Systems of Engagement for business services were layered on top via APIs and UI services, but lacked the portability and flexibility needed to adopt hybrid or multi-cloud strategies.

Details & Artifacts

Solution:

  • Designed the data layer on Oracle Cloud’s Dedicated Region Cloud@Customer (DRCC) to maintain regulatory and performance compliance while introducing cloud-native capabilities.
  • Migrated container-based Systems of Engagement to each client’s preferred hyperscaler (Azure, AWS, OCI, or GCP).
  • Defined hybrid blueprints to optimize workload placement, service selection, and Infrastructure-as-Code strategies.
  • Designed and implemented a low-code Cloud Topology Automation Orchestrator for IaC authoring.
  • Established a Unified Data Model to abstract cloud-specific service variations, enabling standardized, modular provisioning (a sketch follows this list).
  • Delivered an end-to-end Azure data solution ingesting on-prem Oracle Exadata data via ADF and GoldenGate into ADLS, transforming it with Databricks Delta Live Tables, and serving governed analytics and ML insights through Power BI and a Feature Store.
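
The Unified Data Model is what makes provisioning standardizable across hyperscalers. A minimal Python sketch of the idea, with hypothetical type and field names (the real framework renders into Terraform and the cloud SDKs listed below): one provider-neutral spec is translated into per-cloud service shapes.

```python
# Illustrative only: a provider-neutral spec mapped to cloud-specific shapes.
from dataclasses import dataclass

@dataclass
class ManagedDatabaseSpec:
    """Provider-neutral description of a managed database (hypothetical)."""
    name: str
    cpu_cores: int
    storage_gb: int
    region: str  # logical region, e.g. "ch-zurich"

def to_azure(spec: ManagedDatabaseSpec) -> dict:
    # Resolve the logical spec to an Azure SQL-style resource definition.
    return {
        "type": "Microsoft.Sql/servers/databases",
        "name": spec.name,
        "location": {"ch-zurich": "switzerlandnorth"}[spec.region],
        "sku": {"capacity": spec.cpu_cores},
        "maxSizeBytes": spec.storage_gb * 1024**3,
    }

def to_oci(spec: ManagedDatabaseSpec) -> dict:
    # Same spec, rendered as an OCI Autonomous Database-style definition.
    return {
        "resource": "oci_database_autonomous_database",
        "display_name": spec.name,
        "cpu_core_count": spec.cpu_cores,
        "data_storage_size_in_tbs": max(1, spec.storage_gb // 1024),
    }

spec = ManagedDatabaseSpec(name="core-banking-db", cpu_cores=8,
                           storage_gb=2048, region="ch-zurich")
for render in (to_azure, to_oci):
    print(render(spec))  # downstream: feed into Terraform / cloud SDK calls
```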

Impact: Enabled consistent, automated, and secure cloud provisioning for 25+ banking clients managing over $4T AUM; reduced provisioning time from weeks to hours; improved compliance posture; and standardized hybrid cloud deployment patterns.

Stack: CAF, Python, Cloud SDKs (Azure, AWS, OCI, GCP), Low-code Windmill Platform, Terraform, Azure Data Factory, Azure Data Lake Storage Gen2, Databricks, Power BI, REST, GraphQL, Oracle Exadata, Oracle GoldenGate, Kubernetes.

Artifacts: Hybrid Cloud Blueprint, Infrastructure-as-Code libraries, Cloud SDK provisioning framework, Data ingestion pipelines, Workload migration plan, Operational runbooks.

Cloud Transformation · Hybrid Cloud · Multi-Cloud · Core Banking · IaC · Azure · OCI

2015–2019 · Capital Markets & Settlements

One Treasury & OTC Secondary Markets Settlements Modernization

Goal

Establish a sustainable, compliant, and future-proof settlement and confirmation platform, reduce architectural complexity, and enable faster adoption of regulatory changes without operational disruption.

Backstory

The existing settlement functions were fragmented across multiple homegrown legacy applications and COTS applications, each catering to asset classes such as Forex, Precious Metals, and Interest Rate Derivatives, with overlapping functionality and high maintenance costs.

Regulatory initiatives such as PRIIPs and MiFID were adding further complexity by requiring changes across multiple settlement engines. The platform lacked consistent DevOps practices, which slowed delivery and complicated testing and deployment.

Details & Artifacts

Solution: Designed and oversaw the implementation of the OTC Factory program — a central, standardized streaming-based Settlement Engine across multiple derivatives asset classes. Led integration and service interface management across ERP, CRM, Finance, Booking, Payments, SWIFT, and ECM domains. Built the core of a strategic confirmation platform with long-running production stability.

  • Data modelling of the central architecture for the Settlement domain, wrapping the Cash Record and Stock Record.
  • Streaming adoption (Kafka), Spring Boot integration, and global service interface management (see the consumer sketch after this list).
  • Toolchain selection, DevOps adoption, and lifecycle management from adoption to retirement.
  • Executed a structured decommissioning program for the Settlements, Payments, and Reconciliation engines to reduce cost and complexity.
  • Introduced the Perforce Delphix data virtualization tool, reducing the data footprint and refresh times from PROD down to UAT, DEV, and TEST.
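
The streaming backbone was built on Kafka with Spring Boot; as a language-neutral illustration of the pattern, here is a minimal consumer sketch in Python (kafka-python), with hypothetical topic, field, and handler names:

```python
# Sketch of the settlement-event consumption pattern; names are illustrative.
import json
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "settlement-instructions",            # hypothetical topic
    bootstrap_servers=["localhost:9092"],
    group_id="settlement-engine",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    enable_auto_commit=False,             # commit only after the event is handled
)

HANDLERS = {
    "FX": lambda e: print("settle FX trade", e["trade_id"]),
    "PM": lambda e: print("settle precious-metals trade", e["trade_id"]),
    "IRD": lambda e: print("settle interest-rate derivative", e["trade_id"]),
}

for message in consumer:
    event = message.value
    handler = HANDLERS.get(event.get("asset_class"))
    if handler:
        handler(event)    # would update the Cash Record / Stock Record here
    consumer.commit()     # at-least-once semantics
```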

Impact: Lower run costs, faster environment refresh, improved resilience; enabled regulatory compliance at scale.

Stack: Kafka, Spring Boot, Java, Oracle, REST/JSON, CI/CD, DevOps practices.

Artifacts: Architecture patterns, Business Capability Map, proof-of-concept results, decommissioning roadmap, interface catalog.

Settlements · Payments · Capital Markets · TOM · Business Process · Kafka · Spring Boot · Data Model · Oracle

2013–2019 · Regulatory Data Platform

OTC Derivative Confirmation Data Platform

Goal

Build a centralized data platform for confirmations and regulatory reporting, ensuring EMIR-compliant transparency and risk management for OTC derivatives.

Backstory

Asset Management and Wealth Management operated multiple fragmented confirmation and reporting processes, with data scattered across trade capture systems, mainframe-based ERP/CRM, and downstream reporting tools.

Regulatory mandates such as EMIR and FINREG required timely, accurate, and auditable confirmations and trade reporting. The absence of a centralized confirmation data platform led to duplicated workflows, inconsistent data quality, and higher operational costs.

Details & Artifacts

Solution: Built a distributed, regulatory-compliant Confirmation Data Platform with an end-to-end trade capture pipeline (Wealth & Asset Management → master data from ERP/CRM → ECM, client, and regulatory reporting). Delivered EMIR/FINREG compliance and a sustainable platform design, and acted as the single global contact for confirmation management. Integrated the confirmation function with BlackRock Aladdin ($22B AUM).

  • Centralized OTC and derivative confirmation and reporting into a unified platform for trade reporting, unauthorized trading detection, and MIS.
  • Designed high-availability architecture with J2EE, EJB, Oracle Cluster, and IBM MQ for reliable, scalable processing.
  • Implemented modern analytics and reporting pipelines using Apache Spark, Apache Iceberg, ClickHouse, and Tableau (a pipeline sketch follows this list).
  • Introduced a reusable development framework and standardized service patterns for integration and reporting services.
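
A hedged sketch of the analytics-pipeline shape, assuming Spark 3 with an Iceberg catalog named "reporting" configured; the paths and column names are hypothetical:

```python
# Raw confirmation events -> normalized Spark DataFrame -> Iceberg table
# that ClickHouse/Tableau reporting can read. Illustrative, not production code.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("confirmation-reporting").getOrCreate()

raw = spark.read.json("s3://confirmations/raw/")  # hypothetical landing zone

normalized = (
    raw
    .withColumn("confirmed_at", F.to_timestamp("confirmed_at"))
    .withColumn("notional", F.col("notional").cast("decimal(18,2)"))
    .filter(F.col("status").isin("CONFIRMED", "DISPUTED"))
)

# Requires the Iceberg Spark runtime and a configured "reporting" catalog.
normalized.writeTo("reporting.otc_confirmations").append()
```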

Impact: Reduced operational complexity, improved regulatory compliance, and delivered a single source of truth for confirmations and reporting across asset classes.

Stack: J2EE, EJB, Oracle Cluster, Web Services, IBM MQ, Quartz Timer, SWIFT MT, ECM, Apache Spark, Apache Iceberg, ClickHouse, Tableau.

Artifacts: High- and low-level designs, conceptual and physical data models, solution architecture documentation, development framework.

Stakeholders: Group Operations Business, Legal Business Operations, IT application owners, Change Control Board, Infrastructure Platforms, Agile teams, implementation partners.

Confirmation Data Platform · EMIR · FINREG · PRIIPs · MiFID II · Regulatory Reporting · BlackRock Aladdin · API · Apache Spark · Oracle · Tableau · Integration

2006–2013 · Payments, Accounts, Booking

High-Volume Payments Platform Modernization

Goal

Modernize the Swiss payments engine to comply with ISO 20022 standards, scale to millions of daily transactions, and replace monolithic DB2-based architecture with a distributed, resilient payments processing platform.

Backstory

The Swiss payment engine processed over 3 million payments per day using a monolithic DB2 setup. With the introduction of ISO 20022 SWIFT message regulations, a complete redesign was required to ensure compatibility, improve maintainability, and support higher throughput.

As part of the modernization, proprietary message formats needed to be mapped to ISO 20022 standards such as PAIN.001 and PAIN.003 for both inbound and outbound payments, with strict adherence to regulatory requirements.

Additionally, a distributed payments engine was developed to improve performance and resiliency, supported by an operational data store designed entirely on ISO 20022 structures.

Details & Artifacts

Solution:

  • Led engineering for the Payments Stream of the MyShop Project, focusing on ISO 20022 payment message formats (PACS, PAIN, CAMT).
  • Developed transformation components to map ISO 20022 messages to Credit Suisse’s proprietary formats and vice versa (illustrated after this list).
  • Served as IT application owner for the first distributed payments engine integrating ISO 20022 messages into operational workflows.
  • Designed and implemented an operational data store based on ISO 20022 message schemas for real-time processing and reporting.
  • Coordinated with implementation partners for code reviews, testing, and quality assurance.
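
To make the mapping concrete, here is a minimal sketch of parsing a pain.001 credit transfer initiation into a flat internal record. The XML fragment is abbreviated, the internal field names are assumptions, and the message version shown (pain.001.001.03) is one common variant:

```python
# Map an ISO 20022 pain.001 fragment into a hypothetical proprietary record.
import xml.etree.ElementTree as ET

NS = {"p": "urn:iso:std:iso:20022:tech:xsd:pain.001.001.03"}

PAIN_001 = """<Document xmlns="urn:iso:std:iso:20022:tech:xsd:pain.001.001.03">
  <CstmrCdtTrfInitn>
    <PmtInf>
      <CdtTrfTxInf>
        <PmtId><EndToEndId>E2E-42</EndToEndId></PmtId>
        <Amt><InstdAmt Ccy="CHF">1500.00</InstdAmt></Amt>
        <Cdtr><Nm>ACME AG</Nm></Cdtr>
      </CdtTrfTxInf>
    </PmtInf>
  </CstmrCdtTrfInitn>
</Document>"""

root = ET.fromstring(PAIN_001)
tx = root.find(".//p:CdtTrfTxInf", NS)
amount = tx.find("p:Amt/p:InstdAmt", NS)

internal_record = {  # hypothetical proprietary format
    "end_to_end_id": tx.find("p:PmtId/p:EndToEndId", NS).text,
    "currency": amount.get("Ccy"),
    "amount": amount.text,
    "creditor_name": tx.find("p:Cdtr/p:Nm", NS).text,
}
print(internal_record)
```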

Impact: Delivered the first distributed ISO 20022-compliant payments engine at Credit Suisse, enabling regulatory compliance, reducing operational bottlenecks, and improving throughput for millions of daily payments.

Stack: J2EE, EJB, Web Services, Oracle Cluster, IBM MQ, Service Gateway.

Artifacts: High-Level Design (HLD), Low-Level Design (LLD), requirements documentation, framework code for database, web service, and MQ integration.

Payments · ISO 20022 · Integration · Distributed Systems · IBM MQ · Oracle

2024 · AI & Hyperscaler Governance

Any Cloud GenAI Orchestration Platform

Goal

Build a policy-driven AI orchestration platform spanning Azure, AWS, GCP, and OCI; embed SecOps (NIST/OSCAL), FinOps controls, and project management directly into infrastructure and data pipelines; and enable agentic workflows that execute safely and consistently across heterogeneous clouds.

Backstory

Enterprises struggle with fragmented tooling, siloed dashboards, and inconsistent policy enforcement across multiple cloud providers. Applying unified security, compliance, and cost controls across clouds often results in duplicated effort, higher operational costs, and visibility gaps.

syntropAI’s AI Cloud Hub was envisioned to provide a unified, language-driven control plane — replacing scripts, static templates, and ad-hoc automation with a natural-language interface backed by policy-as-code, real-time FinOps intelligence, and auditable governance.

Details & Artifacts

Challenge: Centralize AI governance, FinOps insight, and SecOps enforcement across heterogeneous multi-cloud environments without vendor lock-in or operational silos.

Solution: Policy-as-code orchestration with agentic execution (OpenAI SDK, LangGraph, CrewAI, MCP); NIST/OSCAL control mappings embedded in pipelines; FinOps-aware actions with live tagging and cost checks; multi-lingual operations and approvals captured as an auditable graph.

  • Governed provisioning: one-sentence requests deploy across clouds with automatic policy checks (a policy-gate sketch follows this list).
  • Autonomous troubleshooting: no playbooks; instruct, fix, and record the trail.
  • Approvals & exceptions: built-in dialogue to escalate, approve, and proceed with traceability.
  • Cost intelligence: real-time cloud cost insights before the invoice arrives.
  • SecOps reporting: auto-generate NIST-aligned reports from live state across clouds.
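
A toy illustration of the policy-as-code gate behind governed provisioning: a requested action is evaluated against declarative rules before any cloud call is made. The rule names and request shape are assumptions; in the platform these controls map to NIST/OSCAL catalogs.

```python
# Evaluate a provisioning request against policy rules before executing it.
from dataclasses import dataclass, field

@dataclass
class ProvisionRequest:
    cloud: str                  # "azure" | "aws" | "gcp" | "oci"
    resource: str               # e.g. "object_storage"
    region: str
    tags: dict = field(default_factory=dict)

POLICIES = [  # illustrative rules; real ones map to OSCAL control IDs
    ("data-residency", lambda r: r.region.startswith(("eu-", "ch-"))),
    ("cost-tagging", lambda r: "cost_center" in r.tags),
    ("approved-cloud", lambda r: r.cloud in {"azure", "aws", "gcp", "oci"}),
]

def evaluate(request: ProvisionRequest) -> list[str]:
    """Return names of violated policies; an empty list means 'may proceed'."""
    return [name for name, check in POLICIES if not check(request)]

req = ProvisionRequest(cloud="aws", resource="object_storage",
                       region="us-east-1", tags={})
violations = evaluate(req)
print("BLOCKED" if violations else "APPROVED", violations)
# The agentic layer surfaces violations as an approval/exception dialogue.
```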

Impact: Scalable, auditable AI adoption with reduced duplication, fewer misconfigurations, faster governed delivery, and cost-aware deployments.

Stack: OpenAI, Azure AI Foundry, Azure, AWS, OCI, GCP, Graph DB, Terraform, OSCAL, MCP (servers/clients), Python, TypeScript.

Artifacts: AI Cloud Hub reference architecture; policy-as-code libraries (OSCAL mappings); FinOps dashboards & tagging schema; demo notebooks & runbooks; recorded live demos (provisioning, governance, cost, reports).

Agentic AI · GenAI · Multi-Cloud · Governance · FinOps · SecOps · Policy-as-Code · Project Management

2020–2022 · Platform Engineering

Datacenter Migration for 25+ Banking Clients

Goal

Relocate all infrastructure and workloads for 25 SaaS banking clients to a new co-host datacenter in Zürich with a disaster recovery link to Ticino — ensuring zero downtime, regulatory compliance, and full operational continuity during the migration.

Backstory

The co-host datacenter housing infrastructure for 25 banking SaaS clients (including Edmond de Rothschild, HSBC, Deutsche Bank, and multiple Cantonal banks) was scheduled for closure, with advanced notice issued to vacate the site. The initial strategy to migrate workloads directly to hyperscaler cloud environments proved unfeasible due to the complexity of the multi-tenant setup and coordination challenges among stakeholders.

A new plan was formed to migrate all infrastructure and workloads to a newly identified co-host datacenter in Zürich, with a disaster recovery setup in Ticino, while ensuring uninterrupted banking operations.

Details & Artifacts

Solution:

  • Owned overall migration design and strategy for all clients, covering AIX and x86 workloads.
  • Led end-to-end stakeholder coordination — clients, network providers, procurement, CISO, Data Officers, and program management teams.
  • Established agile migration teams with architects and SMEs for compute, network, database, and observability.
  • Directed sprint planning, work package creation, progress tracking, and risk reporting.
  • Secured dedicated, high-availability migration and operations network lines between Zürich and Ticino.
  • Produced solution documentation, runbooks, and project plans for 150+ migration packages.
  • Oversaw hiring, procurement, approvals, and execution of migration tasks.

Impact: Successfully migrated 1,480 servers (AIX & x86) for 25 banks, executed 233,881 test cases, and vacated the original datacenter with zero downtime or regulatory breach.

Stack / Domains: AIX, x86 virtualization, private cloud, high-availability networking, DR replication, observability tooling.

Artifacts: Solution architecture documents, detailed runbooks, project migration plan, sprint boards, risk registers, compliance approval records.

Cloud Migration · Resilience · AIX · x86 · Banking

2025 · AI & Financial Compliance

Dynamic AML Detection Platform

Goal

Build a real-time Anti-Money Laundering detection platform with live sanctions screening, reducing false positive alerts by 60% while processing millions of transactions daily through advanced rule-based algorithms and multi-provider authentication.

Backstory

Traditional AML systems generate over 90% false positives, overwhelming compliance teams and creating regulatory risks. Rule-based systems couldn't adapt to evolving money laundering patterns, and transaction monitoring relied on batch processing with 24-hour delays.

The challenge was to build a cloud-first platform that could process high-velocity transactions, integrate live sanctions data from multiple sources (OpenSanctions API, OFAC lists), and provide real-time monitoring with explainable decision rationale for regulatory compliance.

The system needed enterprise-grade authentication across multiple identity providers and seamless failover between cloud (Supabase) and local (SQLite) storage for maximum reliability.

Details & Artifacts

Solution:

  • Architected cloud-first platform using Supabase PostgreSQL with automatic SQLite fallback for 100% availability.
  • Implemented 5 real-time detection rules: sanctions screening, geography risk analysis, structuring detection, velocity anomalies, and round-trip patterns (structuring is sketched after this list).
  • Built multi-provider OAuth2 authentication system supporting Google, Microsoft, GitHub, and Oracle Cloud identity providers.
  • Created dynamic transaction generator producing realistic test data with configurable risk profiles for compliance testing.
  • Designed RESTful API with role-based access control and comprehensive audit logging for regulatory compliance.
  • Developed an interactive web dashboard with real-time visualizations, alert management, and system health indicators.
  • Integrated live sanctions data processing with 1.2M+ records from OpenSanctions API and OFAC datasets.
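
As one concrete example from the rule family, a hedged sketch of structuring detection: flag accounts that split value into several just-under-threshold transactions inside a rolling window. The threshold, window, and hit count here are illustrative, not the production tuning.

```python
# Flag accounts making repeated just-under-threshold transactions.
from collections import defaultdict
from datetime import datetime, timedelta

THRESHOLD = 10_000            # illustrative reporting threshold
NEAR_FACTOR = 0.9             # "just under" = within 10% below the threshold
WINDOW = timedelta(hours=72)  # rolling window
MIN_HITS = 3                  # hits inside the window that trigger an alert

def detect_structuring(transactions):
    """transactions: iterable of (account, amount, timestamp), sorted by time."""
    recent = defaultdict(list)
    alerts = []
    for account, amount, ts in transactions:
        if THRESHOLD * NEAR_FACTOR <= amount < THRESHOLD:
            hits = [t for t in recent[account] if ts - t <= WINDOW]
            hits.append(ts)
            recent[account] = hits
            if len(hits) >= MIN_HITS:
                alerts.append((account, len(hits), ts))
    return alerts

txs = [("acc-7", 9_500, datetime(2025, 1, 1, 9)),
       ("acc-7", 9_800, datetime(2025, 1, 2, 11)),
       ("acc-7", 9_900, datetime(2025, 1, 3, 8))]
print(detect_structuring(txs))  # one alert on the third near-threshold deposit
```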

Impact: Reduced false positive alerts by 65%, achieved sub-second transaction processing, maintained 99.9% system uptime, and demonstrated regulatory compliance with zero audit findings during examinations.

Stack: Python, Flask, Supabase PostgreSQL, SQLite, OpenAI SDK, OAuth2 (Authlib), Redis, Chart.js, OpenSanctions API, Docker, Render.

Artifacts: Live demo platform, C4 architecture diagrams, authentication implementation guide, API documentation, detection rules specification, regulatory compliance framework.

AI/ML · Anti-Money Laundering · Real-Time Processing · OAuth2 Authentication · Cloud-First · Regulatory Compliance · API Development

2025 · AI & Energy Decisions

ETH Swiss Energy Scenarios Decipher System

Goal

Develop a sophisticated multi-agent AI platform that transforms complex energy scenario data from Switzerland's Energy Perspectives 2050+ into accessible insights for citizens, journalists, students, and policymakers through intelligent natural language processing.

Backstory

Switzerland's Energy Perspectives 2050+ dataset contains vast, complex information across 103 data files (87 CSV + 16 PDF reports) that's difficult for non-experts to interpret. Citizens, journalists, and policymakers needed quick, reliable answers about energy transition scenarios, but language barriers limited accessibility across Switzerland's multilingual population.

The challenge was to make this critical energy transition data transparent and understandable, enabling informed public discourse about Switzerland's path to Net-Zero 2050 while maintaining scientific accuracy and supporting multiple languages and user types.

Traditional static reports couldn't provide the interactive, user-adaptive insights needed for effective energy policy communication and citizen engagement.

Details & Artifacts

Solution:

  • Architected 6-agent collaborative AI system: Orchestrator, Data Interpreter, Scenario Analyst, Document Intelligence, Policy Context, and Language Translator agents.
  • Processed comprehensive Swiss Federal Office of Energy dataset covering demographics, emissions, energy consumption, electricity generation, and economic projections (2000-2060).
  • Implemented ZERO-Basis vs WWB scenario comparison engine for climate policy analysis and implementation pathway assessment.
  • Built multilingual support system (English, German, French, Italian) with automatic language detection and Swiss energy terminology preservation.
  • Created user-adaptive interface tailoring responses for citizens, journalists, students, and policymakers with appropriate complexity and focus.
  • Developed both Streamlit web interface and CLI for different user preferences and deployment scenarios.
  • Integrated advanced document processing using the ChromaDB vector database and sentence transformers for semantic search across technical reports (sketched below).
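
A minimal sketch of that semantic-search layer, using ChromaDB's default sentence-transformer embeddings; the passages are stand-ins for the Energy Perspectives 2050+ report extracts:

```python
# Embed report passages into ChromaDB and query them in natural language.
import chromadb  # pip install chromadb

client = chromadb.Client()  # in-memory; a persistent client fits deployment
collection = client.create_collection("energy_perspectives")

collection.add(
    ids=["ep-001", "ep-002"],
    documents=[
        "Under the ZERO-Basis scenario, electricity demand rises as heating "
        "and transport electrify on the path to Net-Zero 2050.",
        "The WWB (business-as-usual) scenario keeps fossil shares higher and "
        "misses the 2050 reduction targets.",
    ],
    metadatas=[{"source": "scenario_report.pdf"},
               {"source": "scenario_report.pdf"}],
)

# Chroma embeds the query with its default sentence-transformer model and
# returns the nearest passages, which an agent then cites in its answer.
result = collection.query(
    query_texts=["How does electricity demand change under net zero?"],
    n_results=1,
)
print(result["documents"][0][0])
```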

Impact: Democratized access to complex energy scenario data, enabled evidence-based energy policy discussions, and provided multilingual accessibility to Switzerland's energy transition planning for diverse stakeholder groups.

Stack: Python, OpenAI GPT-4, LangChain, ChromaDB, Sentence Transformers, Streamlit, Pandas, Matplotlib, Plotly, PyPDF2, AsyncIO.

Artifacts: Multi-agent system architecture, 103-file dataset processor, multilingual query interface, scenario analysis framework, ETH research documentation.

Multi-Agent AI · Energy Analytics · ETH Zurich · Multilingual NLP · Climate Policy · Data Science · Vector Databases

2024 · Legal AI Challenge

Legal Intelligence Multi-Agent System

Goal

Design a specialized 3-agent LLM system for target company and law firm identification in legal documents, providing structured JSON analysis for legal professionals with high accuracy and reliable entity extraction capabilities.

Backstory

Legal professionals need to quickly identify key entities (target companies, law firms) from complex legal documents and M&A agreements. Manual analysis is time-consuming and prone to human error, while traditional NLP systems lack the domain-specific understanding required for accurate legal entity extraction.

The challenge required building a system that could handle nuanced legal queries, distinguish between buyer/seller/third-party legal representation, and provide structured output suitable for legal workflow integration.

The system needed to handle edge cases where queries might be irrelevant to legal entity extraction, requiring intelligent query validation before processing.

Details & Artifacts

Solution:

  • Designed sequential 3-agent architecture: LLM1 (Target Detection), LLM2 (Legal Entity Analysis), LLM3 (JSON Compilation) using GPT-4o-mini at temperature 0.2 for consistency (sketched after this list).
  • Implemented intelligent query validation with early stopping mechanism when queries are irrelevant to legal entity extraction tasks.
  • Built comprehensive legal entity extraction engine identifying buyer firms, seller firms, and third-party legal representation from document paragraphs.
  • Created structured JSON output system with validated fields and proper error handling for missing or ambiguous entity information.
  • Developed robust orchestrator managing the complete workflow with paragraph validation, result compilation, and comprehensive error handling.
  • Implemented extensive test suite covering individual agent functionality, integration testing, and edge case validation.
  • Built Pydantic-based data models ensuring type safety and structured output validation throughout the processing pipeline.
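
A hedged sketch of the sequential flow with early stopping, assuming the OpenAI Python SDK and Pydantic v2; the prompts and field names are simplified stand-ins for the real system prompts:

```python
# Three sequential LLM calls with query validation and typed JSON output.
from openai import OpenAI
from pydantic import BaseModel

client = OpenAI()  # reads OPENAI_API_KEY from the environment
MODEL, TEMPERATURE = "gpt-4o-mini", 0.2

class LegalEntities(BaseModel):
    target_company: str | None
    buyer_firms: list[str]
    seller_firms: list[str]

def ask(system: str, user: str) -> str:
    resp = client.chat.completions.create(
        model=MODEL, temperature=TEMPERATURE,
        messages=[{"role": "system", "content": system},
                  {"role": "user", "content": user}],
    )
    return resp.choices[0].message.content

def analyze(query: str, paragraph: str) -> LegalEntities | None:
    # LLM1: validate the query; stop early if it is out of scope.
    verdict = ask("Reply RELEVANT or IRRELEVANT for legal entity extraction.",
                  query)
    if "IRRELEVANT" in verdict.upper():
        return None
    # LLM2: extract entities from the paragraph.
    analysis = ask("Identify the target company, buyer-side law firms, and "
                   "seller-side law firms.", paragraph)
    # LLM3: compile to strict JSON, then validate with Pydantic.
    raw_json = ask("Return ONLY JSON with keys target_company, buyer_firms, "
                   "seller_firms.", analysis)
    return LegalEntities.model_validate_json(raw_json)
```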

Impact: Delivered accurate legal entity extraction with structured JSON output, reduced manual analysis time, and provided reliable automation for legal document processing workflows.

Stack: Python, OpenAI GPT-4o-mini, Pydantic, pytest, JSON validation, multi-agent orchestration framework.

Artifacts: Multi-agent system architecture, comprehensive test suite, system prompts specification, JSON schema validation, legal entity extraction framework.

Legal AI · Multi-Agent Systems · Entity Extraction · LLM Orchestration · JSON APIs · Legal Tech · Document Analysis

Capabilities Matrix

AI & Emerging Tech

  • OpenAI SDK, Azure AI Foundry, LangGraph, CrewAI, MCP
  • Vector Databases: ChromaDB, Pinecone, Weaviate
  • Document Intelligence: PyPDF2, LangChain, LlamaIndex
  • Agentic AI: multi-agent systems, autonomous workflows
  • DLT: Principles for the Securities Industry (ISSA) working group

Engineering, Cloud & Infrastructure

  • Azure, AWS, GCP, OCI
  • Kubernetes, Docker, Terraform / IaC
  • DevOps, FinOps, SecOps (NIST OSCAL)
  • J2EE, Spring, Python, TypeScript, SQL, Oracle

Enterprise & Data Architecture

  • TOGAF methodology
  • Regulatory‑compliant data platforms
  • Integration: REST, SOAP, GraphQL, Kafka, event‑driven
  • Data Modeling, Oracle, Postgres, Graph

Financial Services Domain

  • Capital Markets: trade capture, settlements, confirmations
  • Payments: SIC, SWIFT, SEPA, ISO 20022
  • Wealth Management & regulatory frameworks

Certifications & Credentials

Enterprise Architecture

  • TOGAF 9 Certified

Education & Languages

  • B.E., Mechanical Engineering, University of Mysore (2000)
  • Languages: English, German

Contact

I’m open to roles in Cloud Architecture, AI Transformation, Platform Engineering, and Solution & Data Architecture. For opportunities and consulting, please reach out by email or LinkedIn.

LinkedIn
/in/paihari
GitHub
/paihari

Chat with My AI‑Insight

Ask anything about my profile. If you need further information, contact me by email or LinkedIn.