Analytics Leader  ·  Louisville, KY

Thirteen years building data systems that actually get used.

I'm Dustin Cole. I lead meter analytics at LG&E, where I manage the infrastructure behind 700 million interval rows a day. I build predictive models, migrate data ecosystems, and design reporting tools that help people make real decisions.

700M+
Interval rows / day
100+
Power BI reports deployed
13+
Years of analytics
5+
Industries & domains
Louisville, KY  ·  US Central

I build data systems that help people make decisions they can actually trust.

My career started in operations. I was supervising accounts payable at Baptist Health, consolidating five hospital departments, migrating finance systems, and at some point started building dashboards for the executive team. That part stuck.

From there it was healthcare supply chain analytics at Baptist, then LG&E for the last several years -- working my way from spend analytics and contractor cost modeling up through building the full meter data infrastructure that the utility runs on today. I've worked across Oracle, Qlik, R, Python, Power BI, and Snowflake. The thread through all of it is the same: figure out what the business actually needs to see, then build the thing that shows it.

The work I'm most proud of tends to be the kind that gets used every day by people who have no idea how it works, and that's exactly how it should be. A predictive model for meter overheating that operators rely on each summer. A drill-down dashboard ecosystem that goes from system-level status down to an individual meter's interval history. A Snowflake migration that moved 700 million rows a day off aging on-prem infrastructure and into something the whole organization can build on.

Outside of work I'm finishing a Master's in Data Science at Indiana University, writing a weekly newsletter called The Data Nerve, and learning what it means to be a dad to a one-year-old who has better things to do than look at my dashboards.

The best data work is invisible. It runs in the background, gives people confidence in the numbers, and gets out of the way.

DC
Dustin Cole
Supervisor, Meter Analytics (AMI) — LG&E
Open to opportunities
Location Louisville, KY
Experience 13+ years
LinkedIn dustinwcole
Education
M.S. Data Science
Indiana University Bloomington
In Progress · 2027
B.S. Accounting
Foundation in finance, operations, and business systems
Certifications
Google Advanced Analytics
IBM Data Science
AWS Cloud Computing
DataCamp Data Scientist
DataCamp Data Engineer

Thirteen years of data work across five roles that kept getting harder.

Click any role to see the full story — what was built, what the stakes were, and what it taught me.

2011 — 2014

Baptist Health

Healthcare

Supervisor, Accounts Payable

PeopleSoft Oracle Cloud Analytics AnyDoc OCR Excel KPI Dashboards

This is where data clicked for me. I was brought in to consolidate AP departments across five hospitals and all the doctor offices into a single system. That meant a full migration to PeopleSoft for finance and supply chain, setting up AnyDoc for OCR invoice processing, and rebuilding how payment workflows were tracked and approved.

Once things were running, I built executive KPI dashboards in Oracle Cloud Analytics pulling AP data from across the entire health system. That was the moment I realized I cared more about the dashboards than the AP work. The whole rest of my career has been chasing that feeling.

This role is where the interest in analytics started. Everything after it was intentional.

2014 — 2016

Baptist Health Supply Chain

Healthcare / Supply Chain

Spend Analytics

Qlik Excel VBA Value Analysis Physician Preference Items

Moved into a dedicated analytics role focused on supply chain spend. The core work was physician preference items and value analysis automation — which sounds dry until you're in a room with surgeons and hospital executives debating whether to spend more on a specific type of suture because one doctor prefers it.

The range was genuinely wild. Toilet paper to MRI machines. I worked alongside medical councils to help them balance quality, value, and cost with actual data behind the decisions, not gut instinct. Qlik was the primary BI tool here, and I leaned heavily on Excel VBA to automate the parts that kept eating hours every week.

This is where I learned that good analysis has to connect to a decision someone is actually going to make. If it doesn't change behavior, it doesn't matter.

2016 — 2019

Louisville Gas & Electric

Utilities

Senior Analyst, Supply Market & Inventory Analytics

TRAC Oracle Excel VBA Scenario Modeling BPA Analysis

First role at LG&E, moving into utilities. I administered the TRAC contractor time-tracking system and managed approval workflows for contractor labor at power plants, all the way through to Oracle finance payments. The analysis side covered blanket purchase agreements, purchase agreements, and invoice spend for commercial operations across all plants.

Then COVID hit. Plant managers suddenly needed to make decisions about how many contractors to keep on site under different shutdown scenarios — and there was no tool to help them think through the costs. I built a sequestration model in Excel VBA that projected cost outcomes across multiple plant configurations.
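To give a sense of the model's shape: the original tool was Excel VBA, but the core logic was scenario arithmetic like the following hypothetical Python translation. Every parameter and rate here is invented for illustration; the real model tracked plant-specific configurations and rate structures.

```python
def sequestration_cost(contractors, daily_rate, lodging_per_day, meals_per_day, days):
    """Project the cost of keeping a contractor crew sequestered on site.

    All inputs are hypothetical stand-ins; the production model carried
    many more plant-specific line items than this.
    """
    per_contractor_daily = daily_rate + lodging_per_day + meals_per_day
    return contractors * per_contractor_daily * days

def compare_scenarios(scenarios):
    """Rank shutdown scenarios (name -> cost inputs) cheapest first."""
    costs = {name: sequestration_cost(**kw) for name, kw in scenarios.items()}
    return sorted(costs.items(), key=lambda kv: kv[1])
```

The value wasn't the arithmetic, it was putting side-by-side scenario costs in front of a manager who otherwise had to guess.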

5+ plants received the model — used by managers to make real staffing and cost decisions during COVID

2019 — 2022

Louisville Gas & Electric

Utilities / Analytics

Senior Meter Analyst, AMI Analytics

R SSMS Power BI ESRI ETL Pipelines Poisson Regression Monte Carlo weather.gov API

This is the role where everything accelerated. I built ETL pipelines using R scripts triggered in SSMS, running on a VM I helped configure. Then came the Power BI ecosystem — pulling from seven-plus systems and APIs — built from scratch and grown into something the operations team actually relied on daily.

The flagship dashboard drilled from system-level meter communication health down to a single meter, then out to an ESRI map showing that meter and its neighbors, then into Google Maps, then into paginated interval data reports showing every reading — gaps, overheating events, threshold proximity, full history. One tool, one continuous drill-down from the grid to a single device.

On the modeling side, I built a Poisson regression model to forecast high-temperature meter events. Inputs: meter population, cooling degree days, weather year methodology (historical same-day temps across 20 years), and prior high-temp events. Then added Monte Carlo simulation and a live weather.gov API integration to predict overheating risk 7 to 10 days out with confidence ranges.
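The forecasting approach can be sketched in a few lines. This is an illustrative simplification, not the production model: `base_rate` and `beta_cdd` are hypothetical stand-ins for the fitted Poisson regression coefficients, and only cooling degree days are used as a driver here.

```python
import numpy as np

def simulate_overheat_events(base_rate, beta_cdd, cdd_forecast,
                             n_sims=10_000, seed=42):
    """Monte Carlo over a CDD-driven Poisson rate.

    base_rate and beta_cdd stand in for fitted regression coefficients;
    cdd_forecast is one cooling-degree-day value per forecast day.
    Returns (mean, 5th percentile, 95th percentile) of total events
    over the forecast window.
    """
    rng = np.random.default_rng(seed)
    # Poisson regression uses a log link: lambda_t = exp(b0 + b1 * CDD_t)
    lam = base_rate * np.exp(beta_cdd * np.asarray(cdd_forecast, dtype=float))
    # Draw daily event counts per simulation, then total the window
    totals = rng.poisson(lam, size=(n_sims, len(lam))).sum(axis=1)
    return totals.mean(), np.percentile(totals, 5), np.percentile(totals, 95)
```

Feeding the live weather.gov forecast in as `cdd_forecast` is what turns a static model into a rolling 7-to-10-day outlook with confidence ranges.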

7+ systems feeding one unified Power BI ecosystem
10-day meter failure forecasts with live weather data and confidence intervals

2022 — Present

Louisville Gas & Electric

Current

Supervisor, Meter Analytics & Meter Operations

Snowflake Snowflake Cortex AI Python Power BI LLM Integration Semantic Models Make.com Twilio VAPI

Leading analytics and operations for the entire AMI metering program. The first big lift was a full cloud migration — moving roughly 700 million interval rows per day from on-premise infrastructure to Snowflake. The entire data ecosystem followed: pipelines, 100-plus Power BI reports, governance, all of it.

The current focus is on AI integration. I've built Python pipelines using LLMs and Snowflake Cortex AI so operational users can ask questions in plain English and get back SQL-generated charts and tables — no query knowledge required. Semantic models define the business logic underneath. I also piloted Copilot in Power BI for graph development with the team.
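One practical piece of a pipeline like this is a validation layer between the LLM and the warehouse, so generated SQL can only touch what the semantic model exposes. A minimal sketch, assuming a hypothetical semantic model expressed as a table allowlist (the table and column names below are invented for illustration, not the actual LG&E schema):

```python
import re

# Hypothetical semantic model: tables the language layer may query
SEMANTIC_MODEL = {
    "meter_intervals": {"meter_id", "read_ts", "kwh", "temp_c"},
    "meter_events": {"meter_id", "event_ts", "event_type"},
}

def validate_generated_sql(sql: str) -> bool:
    """Guardrail between the LLM and the warehouse: accept only a
    single SELECT statement over tables the semantic model exposes."""
    stmt = sql.strip().rstrip(";")
    if not stmt.lower().startswith("select") or ";" in stmt:
        return False  # reject multi-statement or non-SELECT SQL
    # Pull table names after FROM / JOIN and check the allowlist
    tables = re.findall(r"\b(?:from|join)\s+([A-Za-z_]\w*)", stmt, flags=re.I)
    return bool(tables) and all(t in SEMANTIC_MODEL for t in tables)
```

A real deployment would lean on the warehouse's own role-based access controls as well; a check like this just fails fast before any query runs.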

Outside of the core job, I've built personal AI agents using Twilio, Make.com, VAPI, and Formspree, built websites, and am actively integrating AI tools into every workflow I touch. Daily user of Claude, ChatGPT, and a rotating set of other LLM tools. This isn't exploration — it's how I work.

700M+ interval rows migrated per day to Snowflake
100+ Power BI reports deployed on Snowflake infrastructure

Thirteen years of tools, methods, and things that actually shipped.

Every skill here has a project behind it. Hover any tag to see where it was applied. The dot shows depth: teal is expert, blue is proficient, gold is applied.

Technical 12 skills
Python: LLM pipelines, Cortex AI, agent building
SQL: every role — SSMS, Snowflake, Oracle
R: ETL pipelines, Poisson regression, Monte Carlo
ETL Pipeline Design: AMI Analytics, Snowflake migration
Excel VBA: COVID sequestration model, supply chain automation
SSMS: AMI Analytics — triggered R scripts on VM
Web Scraping: data collection, weather API integrations
GitHub: version control, portfolio in development
PeopleSoft: Baptist Health AP consolidation
Oracle: LG&E finance payments, contractor workflows
API Integration: weather.gov API, Google Maps, Power BI APIs
AnyDoc OCR: Baptist Health invoice processing
Data Science 11 skills
Predictive Modeling: Meter Health Forecasting Model
Poisson Regression: high-temperature meter event forecasting
Monte Carlo Simulation: meter failure confidence ranges, COVID cost model
Time Series Forecasting: weather year methodology, 20-year temp baseline
Anomaly Detection: meter interval data — gaps, overheating, thresholds
Feature Engineering: meter forecasting inputs — CDD, population, events
Statistical Analysis: spend analytics, value analysis, cost modeling
Exploratory Data Analysis: every project before modeling begins
Regression Analysis: meter event modeling, spend analysis
Machine Learning: M.S. Data Science coursework + applied modeling
Clustering: segmentation work, operational grouping
AI & Automation 9 skills
LLM Integration: Python pipelines, Snowflake Cortex, NL-to-SQL
Prompt Engineering: daily — Claude, ChatGPT, production pipelines
Snowflake Cortex AI: natural language self-service analytics
Natural Language to SQL: semantic models, Cortex AI query layer
Agent Building: personal AI agents — Twilio + Make.com + VAPI
Make.com Automation: AI agent workflows, automated pipelines
Semantic Modeling: Snowflake semantic layer for business users
Copilot (Power BI): piloted for graph development with team
VAPI / Twilio / Formspree: AI agent stack — voice, messaging, forms
BI & Visualization 8 skills
Power BI: AMI ecosystem, 100+ reports on Snowflake
DAX: complex measures, time intelligence, KPIs
Power Query (M): data transformations, multi-source joins
Paginated Reports: interval data reports — all readings per meter
ESRI Maps: meter geolocation drill-down in Power BI
Qlik: Baptist Health Supply Chain spend analytics
Oracle Cloud Analytics: Baptist Health executive KPI dashboards
Tableau: supplemental visualization, prototyping
Cloud & Infrastructure 6 skills
Snowflake: 700M+ rows/day migration, full ecosystem
Snowflake Architecture: warehouse design, data governance, pipelines
AWS: Cloud Computing Certification + applied work
VM Administration: helped set up and manage VM for R/SSMS pipelines
Data Governance: Snowflake migration — access, lineage, standards
Webflow: built dustincoledata.com portfolio site
Depth: Expert — daily use, production-grade work  ·  Proficient — shipped real projects with this  ·  Applied — used it, know it, not my daily driver

Eight projects that moved numbers that mattered to real operations.

These aren't portfolio demos built for a GitHub audience. They're tools that plant managers made decisions with, operators checked every summer morning, and a whole utility runs on today.

02
Predictive Model

Meter Health Forecasting β€” Poisson Regression + Monte Carlo + Live Weather

Built to predict meter overheating events 7 to 10 days out. Inputs: meter population, cooling degree days, a 20-year weather baseline built using weather year methodology, and prior high-temp events. Added Monte Carlo simulation for confidence ranges, then wired in a live weather.gov API so predictions refreshed automatically as the weather forecast changed. Operators use this every summer to decide where to stage crews.

Impact: Proactive crew staging before failure events, not reactive scrambles after them.
R Poisson Regression Monte Carlo weather.gov API Time Series
03
BI / Dashboard

Power BI AMI Dashboard Ecosystem β€” System Level to Individual Meter

A drill-down dashboard that starts at the full meter network and goes all the way down to a single device. System communication health, individual meter status, ESRI map with geolocation of that meter and its neighbors, click-out to Google Maps, then into paginated interval reports showing every reading — gaps, overheating events, threshold proximity, full history. One tool, one continuous path. Seven-plus source systems feeding it.

Impact: Replaced a half-dozen disconnected tools. Operations team now diagnoses meter issues in minutes instead of hours.
Power BI DAX ESRI Google Maps API Paginated Reports R
04
AI / LLM

Natural Language to SQL β€” Snowflake Cortex AI Self-Service Analytics

Built Python pipelines using LLMs and Snowflake Cortex AI so operational users can ask questions in plain English and get back SQL-generated charts and tables — no query knowledge required. Semantic models define the business logic underneath so the language layer knows what the data actually means. Users who've never written a line of SQL are pulling operational reports on their own.

Impact: Reduced analyst time on routine data pulls. Non-technical stakeholders self-serve on data they previously had to request.
Python Snowflake Cortex AI LLM Integration Semantic Modeling SQL
05
Scenario Modeling

COVID Sequestration Cost Model β€” Excel VBA Scenario Tool for Power Plants

When COVID hit in 2020, plant managers had a real problem: no tool existed to help them think through the cost of different contractor sequestration scenarios. I built one in Excel VBA. Input the plant configuration, choose the scenario, get the projected cost outcomes. Distributed to managers across all five-plus plants. They made real staffing decisions with it during a period when getting those calls wrong was expensive.

Impact: Data-backed decisions during a high-stakes operational period. Plant managers had a tool; before that, they had spreadsheets and guesswork.
Excel VBA Scenario Modeling Cost Analysis Oracle
06
AI Agents

Personal AI Agent Stack β€” Twilio, Make.com, VAPI, and Formspree

Built a stack of personal AI agents using the tools I had: Twilio for SMS and voice channels, VAPI for voice AI, Make.com to wire together automated workflows, and Formspree for form-based triggers. These aren't demos. They handle real tasks — routing, notification, response, follow-up — without me touching them. This is what AI integration looks like when you stop reading about it and start building.

Impact: Proof that agentic AI workflows are buildable by one practitioner with off-the-shelf tools and a clear problem to solve.
Twilio Make.com VAPI Formspree LLM Integration
07
Content

The Data Nerve β€” Weekly Newsletter on Analytics and AI

A weekly newsletter on LinkedIn for data professionals who want the practical side of AI and analytics — not hype, not fluff, actual things that work in real environments. Every issue is written from the perspective of someone who's actually using these tools in production, not summarizing someone else's blog post.

LinkedIn Content Strategy AI Tools Analytics
08
Digital Product

AI-Powered Data Scientist's Playbook β€” Digital Product on Gumroad

A $47 digital product for data professionals who want a practical framework for integrating AI into their analytics workflow. Not a beginner's guide to ChatGPT. A practitioner's playbook built from real experience doing this in a production environment — what to prompt, how to structure it, where AI actually helps and where it doesn't.

Gumroad LLM Workflows Python Prompt Engineering

Things I built because I wanted them to exist in the world.

The same frameworks I use in production at a major utility, packaged for data professionals who want to actually use AI in their work -- not just read about it.

Available Now
$47
One-time purchase · Instant download

AI-Powered Data Scientist's Playbook

This isn't a beginner's guide to ChatGPT. It's a practitioner's playbook built from actual experience integrating LLMs into production data workflows -- what to prompt, how to structure it, where AI genuinely helps and where it gets in the way. Written by someone who runs 700 million rows a day through a Snowflake environment and uses Claude and ChatGPT as daily production tools.

Who it's for: Data analysts and scientists who are curious about AI integration but haven't found a framework that connects to real analytics work -- not toy examples, not abstract theory.

What's inside
How to integrate LLMs into Python data pipelines without breaking production
Natural language to SQL -- building a query layer non-technical users can actually use
Prompt engineering for data work (not marketing copy -- actual analytical tasks)
Snowflake Cortex AI and semantic models -- the setup that actually works
Where to start when you're adding AI to a workflow that can't afford downtime
Get the Playbook — $47
In Development

Next product

Something for people who need to move their data infrastructure into the current decade.

The Snowflake migration at LG&E took 700 million rows a day from aging on-prem infrastructure to a cloud architecture the whole organization can build on. That process is repeatable -- and worth documenting properly.

A practical guide is coming. The kind that covers what the docs don't: what breaks, what order to do things in, and how to migrate 100-plus reports without blowing up a team's morning routine.

Snowflake Migration Guide for Analytics Teams
Power BI on Snowflake -- the full setup
ETL Pipeline Design for AMI / IoT data at scale
Follow for updates on LinkedIn
Built from real production experience -- not theory
Immediate download after purchase
Written for practitioners, by a practitioner
More products coming -- follow on LinkedIn

If you have a real problem, I'm genuinely interested.

I'm open to conversations about roles, projects, or anything in the data and AI space that's worth talking about. I don't do cold calls or generic networking. But if you've got something specific on your mind, I'll read it and respond.

The fastest way to reach me is LinkedIn. The form works too -- I check it regularly. If you're reaching out about the AI Playbook or the newsletter, those links are in the Products and Newsletter sections above.

Send a message

I'll respond within a day or two. No auto-replies, no sales sequence -- just me reading it.

Message sent. I'll get back to you soon.
Something went wrong -- try emailing me directly at dustincole.ent@gmail.com.

No newsletters, no spam. This goes straight to my inbox.