Real Time Reporting with D365 Data Events and Fabric
Description
In this session, we will explore how Dynamics 365 Data Events enable the generation of event messages triggered by CRUD operations on data entities. These messages can be streamed to Azure Event Hubs, consumed by a Fabric Event Stream, and ultimately fed into an Eventhouse that serves as the data source for live Power BI reports.
We will dive into several practical use cases for real-time reporting, then guide you through the setup and configuration process for this powerful solution.
This session provides a technical deep dive into the integration of Dynamics 365, Azure, and Fabric, covering key components, architectural best practices, and real-world considerations.
Key Takeaways
- Understand how to stream D365 data events into Fabric
- Learn to build real-time Power BI dashboards
- Explore performance and latency benchmarks
Slides
Real-Time Reporting with D365 Data Events and Fabric
Tobias Eld
General Manager, Analytics
Fresche Solutions
Agenda
• Session Goals:
• Understand how to stream D365 data events into Fabric
• Learn to build real-time Power BI dashboards
• Explore performance and latency benchmarks
Session Agenda
- Why Real-Time Reporting Matters
  - Business value of live data
  - Common use cases: sales, production, warehouse
- Technical Architecture Deep Dive
  - Dynamics 365 → Azure → Fabric → Power BI
  - Key components and data flow
- Pipeline Setup Walkthrough
  - Configuring Data Events, Event Hubs, Fabric Event Streams, and Event House
  - Best practices and common pitfalls
- Live Demo: Sales Orders in Action
  - Real-time data flow from D365 to Power BI
  - Performance and latency metrics
- Extending to Other Scenarios
  - Applying the same approach to production and warehouse data
- Limitations & Optimization Tips
  - Constraints, tuning strategies, and governance
- Q&A: Open floor for audience questions
Session Objectives
What will you learn today?
1. Understand how to stream D365 FSCM data events into Fabric
2. Learn to build real-time Power BI dashboards
3. Explore performance and latency benchmarks
The Case for Real-Time Data
Faster Decision-Making
• Real-time insights empower teams to act immediately on operational changes.
• Example: Spot and resolve order fulfillment issues before they escalate.
Warehouse Data
Track real-time warehouse
performance metrics:
✓ Daily Throughput
✓ Daily Transactional analysis
✓ Order Accuracy
✓ Pick & Pack Cycle times
✓ Order to ship times
✓ Space Utilization
✓ Order Lead Time
Operational Agility
• React to production delays, inventory shortages, or sales trends as they happen.
• Improve customer satisfaction and reduce downtime.
Data Freshness
• Traditional reporting relies on batch ETL processes that are often hours or days old.
• Real-time reporting ensures dashboards reflect the current state of business.
Sales Data
Track and visualize sales performance metrics including:
✓ Attainment to target
✓ Sales Volume
✓ Revenue
✓ Sales Per Rep
✓ Discounts Provided
Manufacturing Data
Track and visualize manufacturing data metrics including:
✓ Cost per unit
✓ Work In Progress
✓ Production Throughput
✓ Scrap Rates
✓ Yield Rate
✓ Planned vs. Actual
[Screenshot: live dashboard tiles for New Orders Received, Orders Picked, Outstanding Orders, Orders Processed]
End-to-End Pipeline for Real-Time Reporting
Component Breakdown:
- Dynamics 365 Data Events
  • Emits event messages on Create, Update, Delete operations.
  • Configured via the Business Events framework.
  • Payloads are JSON-formatted and include metadata about the entity and action.
- Azure Event Hubs
  • Acts as the ingestion layer for event messages.
  • Supports high-throughput, low-latency streaming.
  • Enables partitioning and scaling for large volumes of events.
- Fabric Event Stream
  • Real-time data processing engine in Microsoft Fabric.
  • Connects directly to Event Hubs.
  • Allows transformations, filtering, and routing of data.
- Event House
  • Stores streamed data in a structured format.
  • Supports semantic modeling for analytics.
  • Serves as the source for Power BI reports.
- Power BI Dashboards
  • Connects to Event House via DirectQuery.
  • Visualizes real-time data with auto-refresh.
  • Enables live monitoring of business operations.
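To make the flow concrete, here is a toy Python sketch that chains the five stages as plain functions. Everything here (function names, payload shape, partitioning logic) is illustrative, not the API of any of these services:

```python
# Toy simulation of the pipeline stages; all names and shapes are
# illustrative assumptions, not real service APIs.

def d365_data_event(entity, operation, record_id):
    """Stage 1: D365 emits a JSON-like event on a CRUD operation."""
    return {"EntityName": entity, "Operation": operation, "RecordId": record_id}

def event_hub_ingest(event, partitions=4):
    """Stage 2: Event Hubs assigns the event to a partition for scale-out."""
    return {"partition": hash(event["RecordId"]) % partitions, "body": event}

def event_stream_transform(message):
    """Stage 3: Fabric Event Stream filters/reshapes before routing."""
    e = message["body"]
    return {"Entity": e["EntityName"], "Op": e["Operation"], "Id": e["RecordId"]}

def eventhouse_store(table, row):
    """Stage 4: Eventhouse appends the row; Power BI (stage 5) queries it."""
    table.append(row)
    return table

sales_events = []
row = event_stream_transform(
    event_hub_ingest(d365_data_event("SalesOrderHeader", "Created", "SO-001")))
eventhouse_store(sales_events, row)
print(sales_events)
```

The point of the sketch is the shape of the flow: each stage consumes the previous stage's output, so any stage can be scaled or swapped independently.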
Triggering Real-Time Events
What Are Data Events?
• Data Events are part of the Business Events framework in Dynamics 365.
• They emit structured messages when Create, Update, or Delete operations occur on supported entities.
• These messages are sent in JSON format, containing metadata like:
  • Entity name
  • Operation type
  • Timestamp
  • Record ID and field values
Supported Entities:
• Built-in entities such as SalesOrderHeader, CustCustomerv3, ProductionOrder
• Custom Data Entities
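As an illustration of consuming one such message, here is a minimal Python sketch. The field names in the payload are assumptions for this example; the exact schema depends on your environment and entity:

```python
import json

# Hypothetical D365 data-event payload; field names and structure are
# assumptions, not the exact schema your environment emits.
raw = json.dumps({
    "EntityName": "SalesOrderHeader",
    "Operation": "Updated",
    "EventTime": "2024-05-01T12:34:56Z",
    "RecordId": "SO-000123",
    "Fields": {"SalesOrderStatus": "Delivered"},
})

# A downstream consumer parses the JSON and routes on entity + operation.
event = json.loads(raw)
if event["EntityName"] == "SalesOrderHeader" and event["Operation"] == "Updated":
    print(f"Order {event['RecordId']} changed: {event['Fields']}")
```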
Ingesting Events with Azure Event Hubs
What is Azure Event Hubs?
• A high-throughput, real-time data ingestion service.
• Designed to handle millions of events per second.
• Acts as the bridge between Dynamics 365 and downstream consumers like Fabric.
Processing Events in Real Time with Fabric
What is a Fabric Event Stream?
• A real-time data ingestion and transformation pipeline within Microsoft Fabric.
• Connects directly to Azure Event Hubs to consume event data.
• Enables low-latency processing, filtering, and routing to destinations like Event House.
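The kind of filtering and routing an Event Stream applies can be sketched in plain Python. The event shape, routing rule, and destination table names below are illustrative assumptions, not Event Stream configuration syntax:

```python
# Illustrative sketch of the filter/route step an Event Stream performs;
# event shapes and destination names are assumptions for this example.
events = [
    {"EntityName": "SalesOrderHeader", "Operation": "Created"},
    {"EntityName": "ProductionOrder", "Operation": "Updated"},
    {"EntityName": "SalesOrderHeader", "Operation": "Deleted"},
]

def route(event):
    # Send sales events to one Eventhouse table, production to another.
    if event["EntityName"] == "SalesOrderHeader":
        return "SalesEvents"
    return "ProductionEvents"

routed = {}
for e in events:
    routed.setdefault(route(e), []).append(e)

print({table: len(batch) for table, batch in routed.items()})
# prints {'SalesEvents': 2, 'ProductionEvents': 1}
```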
Event House & Semantic Model
What is Event House?
• A real-time data warehouse in Microsoft Fabric.
• Designed to store and query high-velocity event data.
• Supports Direct Lake mode for ultra-fast Power BI connectivity.
Key Capabilities:
• Schema-on-read: Flexible ingestion of semi-structured data.
• Partitioning: Efficient storage and querying of time-series data.
• Integration with Power BI: Enables live dashboards with minimal latency.
Using KQL Update Policies with Kusto Functions
• Purpose: automatically transform and ingest data into one or more target tables as new data arrives in a source table, which is ideal for real-time scenarios like D365 event streams.
How It Works:
• Source Table: Your Event House table receives raw event data from the Fabric Event Stream.
• Kusto Function: You define a KQL function that transforms or filters the incoming data.
• Update Policy: You attach the function to a target table via an update policy, which runs automatically when new data lands in the source.
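A minimal sketch of this pattern in KQL follows. All table, function, and column names are assumptions for illustration; adapt them to your event schema:

```kql
// Hypothetical source table receiving raw JSON events from the Event Stream.
.create table RawEvents (EventData: dynamic)

// Kusto function that filters and reshapes incoming events.
.create function ParseSalesEvents() {
    RawEvents
    | where tostring(EventData.EntityName) == "SalesOrderHeader"
    | project OrderId = tostring(EventData.RecordId),
              Operation = tostring(EventData.Operation),
              EventTime = todatetime(EventData.EventTime)
}

// Target table whose schema matches the function's output.
.create table SalesEvents (OrderId: string, Operation: string, EventTime: datetime)

// Update policy: run the function automatically as new rows land in RawEvents.
.alter table SalesEvents policy update
@'[{"IsEnabled": true, "Source": "RawEvents", "Query": "ParseSalesEvents()", "IsTransactional": false}]'
```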
Live Demo – Sales Orders in Action
• End-to-End Real-Time Reporting in Action
Performance Benchmarks
End-to-End Delay: ~7–20 seconds from a D365 save to Power BI visualization.
Breakdown:
• D365 to Azure Event Hub: ~3–10 seconds
• Event Hub to Fabric Event Stream: ~1 second
• Event Stream to Event House: <1 second
• Event House to Power BI (DirectQuery): ~1–2 seconds
Throughput
• D365 FSCM:
  • Supports 5,000 events per 5 minutes (burst rate)
• Azure Event Hub:
  • Supports millions of events per second
• Fabric Event Stream:
  • Handles high-frequency ingestion with low overhead.
Refresh Behavior
• Power BI with DirectQuery:
  • Near real-time updates without manual refresh.
  • Auto-refresh intervals configurable (e.g., every 10–30 seconds).
• Power BI Real Time Dashboards:
  • Fastest way to report real-time data
  • Ideal for Eventhouse
  • Requires more user training and is less flexible
Extending to Other Scenarios
Production Jobs
• Track job creation, status changes, and completions.
• Monitor delays or bottlenecks in manufacturing workflows.
• Trigger alerts for jobs stuck in specific stages.
Warehouse Transactions
• Visualize picking, packing, and shipping activities.
• Detect inventory movement in real time.
• Improve fulfillment accuracy and reduce cycle time.
Considerations for Scaling:
• Entity-Specific Schemas: Each use case may require tailored transformations.
• Volume & Frequency: Warehouse events may be high-frequency; optimize partitions and stream logic.
Limitations & Optimization
• Cost: the solution spans several paid services (Event Hubs, Fabric capacity)
• Different technologies to learn (KQL/Kusto, Event Hubs)
• Data events are based on table updates; views and virtual fields are not covered
• Customizations must be matched by custom data entities
• 5,000 events per five minutes is the official limit
Key Takeaways
Key Insights
• Real-time reporting is achievable with native Microsoft tools.
• The architecture is modular and scalable across multiple business scenarios.
• Performance and governance are manageable with the right setup and monitoring.
• KQL update policies and semantic modeling are essential for clean, consumable data.
Next Steps
• Identify key entities in your D365 environment for real-time reporting.
• Set up a pilot pipeline for a single use case (e.g., sales orders).
• Explore Fabric’s Real-Time Analytics workspace and Event House capabilities.
• Collaborate with your BI and IT teams to align on governance and performance goals.
Suggested Resources
- Business events overview: https://learn.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/business-events/home-page
- Azure Event Hubs documentation: https://learn.microsoft.com/en-us/azure/event-hubs/
- Microsoft Fabric documentation: https://learn.microsoft.com/en-us/fabric/