$ cat /home/ayo/profile.txt
Loading professional profile...
AYO MOSANYA
Austin, TX • mosanyaayo@gmail.com
I solve business problems by applying research-driven methodologies across domains and combining their strengths in the data analytics space. Currently at Charles Schwab, I reverse-engineer existing systems to redesign scalable data processes, design visuals that lower cognitive load, and automate the tedious parts of reporting—extraction, transformation, validation, and visualization—using AI.
I study frameworks for solving problems at scale, stay current on development news, and build projects in my free time to learn. I've built and deployed websites from scratch myself, handling advanced DNS configuration and language-specific deployment strategies.
Value Estimation Methodology
This value estimate follows standard industry practice for calculating ROI on automation and efficiency initiatives. The methodology uses conservative assumptions and is based on measurable outputs.
Methodology: Follows standard IT ROI calculation frameworks (Gartner, Forrester TEI)
Assumptions: 2,080 work hours/year, financial services industry rates, Fortune 500 scale
Verification: Time savings validated through before/after process measurements
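The hours-saved valuation described above can be sketched in a few lines. The hourly rate and weekly savings below are illustrative placeholders, not figures from the actual estimate:

```python
# Minimal sketch of a conservative time-savings valuation.
# Numbers here are hypothetical; only the 2,080 work-hours/year
# assumption comes from the methodology above.

WORK_HOURS_PER_YEAR = 2_080  # standard assumption stated in the methodology

def annual_value(hours_saved_per_week: float, hourly_rate: float,
                 weeks_per_year: int = 52) -> float:
    """Annual value of a recurring weekly time saving."""
    return hours_saved_per_week * weeks_per_year * hourly_rate

# e.g. 8 hours/week at a hypothetical $75/hr blended rate
print(annual_value(8, 75))  # 8 * 52 * 75 = 31200.0
```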
EXPERIENCE.log
- [01] Reverse-engineered existing Tableau dashboards to redesign data processes for scalability, reducing cognitive load through cleaner visual hierarchy and updated metrics that tell a clearer story
- [02] Automated the most tedious parts of reporting—data extraction, transformation, validation, and visualization—using AI-augmented development workflows
- [03] Saved hundreds of work hours by building Python automation with oracledb, asyncio, polars, duckdb, io, and pytz for concurrent data processing and transformation
- [04] Designed reusable code utilities for Excel report generation, PDF generation, and documentation automation using pandoc and tectonic
- [05] Delivered compliant responses to Federal Reserve Board examination inquiries with zero remediation costs, serving as technical authority for regulatory data analysis
- [06] Mentored Risk Specialists in SQL, Python, and Tableau, accelerating their career progression to Senior roles through hands-on technical guidance
- ▸ Drove measurable cost reductions for Fortune 500 clients, by engineering automated cloud spend analysis models using advanced Excel formulas for dynamic reporting
- ▸ Identified cost optimization opportunities through weekly technical recommendations, by analyzing AWS Redshift cloud usage patterns and infrastructure metrics
- ▸ Streamlined cross-platform cloud computing analysis and reporting, by developing consolidated data frameworks for multi-million dollar infrastructure investments
- ▸ Enhanced M&A technology assessments and reduced manual effort by 60%, by designing multi-platform reporting solutions using ServiceNow, Salesforce CRM, and Tableau Desktop
- ▸ Ensured alignment between technical implementations and business requirements, by implementing quality assurance protocols for acquisition integrations
- ▸ Achieved successful system adoption by acquired companies, by developing and facilitating technical training programs for post-close integration
- ▸ Achieved top 5% global ranking for customer satisfaction, by leading technical problem-solving initiatives and delivering data-driven recommendations to optimize operations
- ▸ Increased customer engagement and loyalty, by analyzing customer needs and delivering targeted technical solutions through data-informed consultation
ERM Horizontal Reporting
Built an interconnected reporting system that tells the full risk story by showing how processes, risks, controls, and issues relate through aggregation.
The Challenge
- ✗ Millions of rows across interconnected tables—processes, risks, controls, issues
- ✗ A single giant SQL query would take 45+ minutes and time out
- ✗ Traditional reporting showed siloed data, not the connected story
- ✗ Executives needed aggregations showing how risk cascades affect each other
✓ The Solution
- ✓ ETL Toolkit + asyncio: Parallel workers running multiple focused queries
- ✓ Python data joining: Connect results in memory after parallel extraction
- ✓ DuckDB aggregations: Lightning-fast in-memory analytics on combined dataset
- ✓ Result: Full horizontal report in 3 minutes vs. 45+ minutes
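The parallel-extraction pattern above can be sketched as follows. In the real pipeline each worker runs a focused SQL query against Oracle and the combined result is aggregated in DuckDB; here the queries are simulated so the structure is visible without a database, and all names are illustrative:

```python
import asyncio

async def run_query(name: str) -> dict:
    """Stand-in for one focused extraction (processes, risks, controls, issues)."""
    await asyncio.sleep(0)  # real version: await an async oracledb fetch
    fake_rows = {"processes": 3, "risks": 5, "controls": 4, "issues": 2}
    return {"table": name, "row_count": fake_rows[name]}

async def extract_all(tables: list[str]) -> dict[str, dict]:
    # Run the focused queries concurrently instead of one giant join
    results = await asyncio.gather(*(run_query(t) for t in tables))
    # "Python data joining": connect results in memory after parallel extraction
    return {r["table"]: r for r in results}

combined = asyncio.run(extract_all(["processes", "risks", "controls", "issues"]))
print(sum(r["row_count"] for r in combined.values()))  # 3 + 5 + 4 + 2 = 14
```

The in-memory join step is where DuckDB would take over in the real system, running aggregations across the combined tables.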
SKILLS.conf
EDUCATION
B.S., Computer Information Systems
University of Texas at Tyler
PERSONAL_DEV.log
● MODULAR FRAMEWORK
A modular directory of reusable Python scripts that forms an auditable, recreatable development framework. This system creates a shareable foundation that lowers the barrier to custom code development: if you can look up documentation and find the right libraries, you can solve business problems efficiently.
The Framework Philosophy
1. Understand the business problem — What are we actually trying to solve?
2. Identify impacts & executive outcomes — What does success look like to leadership?
3. Ask the right discovery questions — And understand why you're asking them
4. Think through various solutions — Don't commit to the first idea
5. Measure solution impacts — Which approach best fits the constraints?
6. Estimate downstream impact — Notify involved parties before changes
7. Automate repeated work — Eliminate time sinks systematically
etl_toolkit/
├── connections/ — OracleDB connectors, connection pooling
├── extraction/ — Multi-SQL file execution, batch queries
├── analysis/ — Aggregate counts, data profiling, analytics
├── excel/ — Creation, formatting, export utilities
├── validation/ — Extract comparison, count checks, auditing
├── visualization/ — D3.js mockups, Plotly charts, AI iteration
├── datetime/ — Date organization, fiscal calendars, scheduling
└── cli/ — One-command report runner
Data Connectivity
- ▸ OracleDB connection management
- ▸ Multi-.sql file batch execution
- ▸ Parameterized query templates
- ▸ Connection pooling & retry logic
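The retry half of the "connection pooling & retry logic" above might look like the generic wrapper below. The real connectors would use python-oracledb pools (`oracledb.create_pool`); this decorator is an illustrative sketch of the retry shape only:

```python
import time
from functools import wraps

def with_retry(attempts: int = 3, delay: float = 0.0):
    """Retry a callable up to `attempts` times, sleeping `delay` between tries."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            last_err = None
            for _ in range(attempts):
                try:
                    return fn(*args, **kwargs)
                except Exception as err:  # real version: catch DB errors only
                    last_err = err
                    time.sleep(delay)
            raise last_err  # all attempts exhausted
        return wrapper
    return decorator
```

In production code the bare `except Exception` would be narrowed to the database driver's transient-error types rather than swallowing everything.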
Analytics & Reporting
- ▸ Aggregate count compilation
- ▸ Data extract profiling
- ▸ Excel creation & formatting
- ▸ Automated export pipelines
Validation & Audit
- ▸ Extract-to-code aggregation comparison
- ▸ Count validation checks
- ▸ Discrepancy flagging
- ▸ Audit trail generation
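The count-validation step above can be sketched as a comparison of row counts between the prior and current extracts, flagging tables that moved beyond a tolerance. Function and field names are illustrative, not the toolkit's actual API:

```python
def flag_discrepancies(prior: dict[str, int], current: dict[str, int],
                       tolerance: float = 0.10) -> list[str]:
    """Return tables whose row count changed by more than `tolerance` (fractional)."""
    flagged = []
    for table, prev_count in prior.items():
        curr_count = current.get(table, 0)
        if prev_count == 0:
            if curr_count != 0:  # new rows where there were none
                flagged.append(table)
            continue
        if abs(curr_count - prev_count) / prev_count > tolerance:
            flagged.append(table)
    return flagged

print(flag_discrepancies({"risks": 100, "controls": 50},
                         {"risks": 150, "controls": 52}))  # ['risks']
```

Flagged tables would then feed the audit trail rather than silently shipping in the report.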
Visualization & Prototyping
- ▸ D3.js chart mockups
- ▸ Plotly interactive dashboards
- ▸ AI-assisted rapid iteration
- ▸ Stakeholder preview generation
# Run any configured report with a single command
$ etl run weekly-account-summary
→ Connects to Oracle
→ Executes 12 SQL files
→ Compiles aggregates
→ Validates against prior extract
→ Generates formatted Excel
→ Exports to shared drive
✓ Complete in 3 minutes (was 45 min manual)
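The one-command runner's shape can be sketched as a named report mapped to an ordered list of step functions. The step names mirror the output above; the implementations here are stubs, and all names are illustrative:

```python
from typing import Callable

Step = Callable[[dict], dict]

def run_report(name: str, registry: dict[str, list[Step]]) -> dict:
    """Run each configured step for `name` in order, threading state through."""
    state: dict = {"report": name, "log": []}
    for step in registry[name]:
        state = step(state)
    return state

# Stub steps; real versions would connect, query, validate, and export
def connect(state):  state["log"].append("connect");  return state
def execute(state):  state["log"].append("execute");  return state
def validate(state): state["log"].append("validate"); return state
def export(state):   state["log"].append("export");   return state

REPORTS = {"weekly-account-summary": [connect, execute, validate, export]}
result = run_report("weekly-account-summary", REPORTS)
print(result["log"])  # ['connect', 'execute', 'validate', 'export']
```

A thin CLI entry point (e.g. argparse) would translate `etl run <name>` into a `run_report` call against this registry.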
Business Impact
8+ hours saved weekly through automation of repeated extraction, validation, and reporting workflows.
Every process is documented, version-controlled, and recreatable. No more "how did we calculate this last quarter?"
Team members can solve new problems by composing existing modules rather than starting from scratch.
Skills Demonstrated
Framework Design • Modular Architecture • Database Connectivity • ETL Pipeline Development • Data Validation • Excel Automation • CLI Tool Development • Process Documentation • Team Enablement • Visualization Prototyping • AI-Assisted Development
Research-Driven Development
I study frameworks and methodologies from diverse domains—functional programming, fault-tolerant systems, regulatory compliance, quantitative finance—and apply them to solve data analytics problems in novel ways.
Build to Learn
Every project is a learning laboratory. TabMine taught me hybrid architectures. FRAP taught me regulatory compliance engineering. TradingAlgo taught me iterative version evolution. I ship to understand.
AI as Force Multiplier
I use AI tools (Copilot, Claude, ChatGPT) not as crutches but as accelerators. I understand what I'm building and use AI to move faster, not to think for me.
Polyglot by Design
Python for ML/analytics. Elixir for concurrency and fault tolerance. Each language for its strengths. I build bridges between ecosystems rather than forcing one tool to do everything.
Continuous Learning
I read development news daily, study new frameworks, and build side projects to explore technologies before I need them professionally. Learning is not separate from work—it is the work.
🚢 Ship & Iterate
TradingAlgo went through 4 major versions. Each taught me what the next needed. I believe in shipping imperfect systems and improving them based on real usage, not endless planning.
READ MY INSIGHTS
Exploring data analytics, enterprise risk management, and the future of financial technology.
ENTER BLOG →