Course Overview
Please note: learners will be placed into teams of 5, in which they will collaborate throughout the hack. If you wish to book a dedicated team of 5, the price is £7,500 + VAT.
This OpenHack enables attendees to learn how to implement mission-critical data solutions for modern applications.
This OpenHack simulates a real-world scenario where developers need to migrate a monolithic legacy application to a cloud-based NoSQL environment, taking advantage of microservices and event sourcing.
During the “hacking”, attendees will focus on:
- Migrating an existing on-premises database to the cloud (using NoSQL as the data solution) and then collecting and storing real-time data in the new NoSQL data platform.
- Optimizing the NoSQL database through modeling, implementing a good partition strategy, indexing, denormalization, creating materialized views, and tuning throughput for varying workloads. Next, attendees will implement an event sourcing pattern to optimize, scale, and distribute the processing of streaming data in real-time.
- Expanding search capabilities by enabling full-text, fuzzy, and faceted searches against the data store.
- Configuring replicas across multiple regions worldwide.
By the end of the OpenHack, attendees will have built out a technical solution that incorporates several complementary Azure services for stream processing, data storage, and visualization. These services work together to create an end-to-end modern, flexible, and scalable cloud-native NoSQL data solution to store, process, and access any required business data.
Who should attend
Target Audience:
- Microsoft – CSE, CSA, GBB, ATT, SE, TPM
- Customer
Target verticals: Cross-Industry
Customer profile:
- Customers looking to migrate from an on-premises database to a cloud-based NoSQL environment.
Prerequisites
Knowledge Prerequisites
- To be successful and get the most out of this OpenHack, participants should have familiarity with database concepts such as data modeling, partitioning, denormalization, and indexing. Prior experience with NoSQL databases and familiarity with relational data structures is helpful, but not required.
- Knowledge of Azure fundamentals is required.
Language-Specific Prerequisites
- It is recommended that participants have previous experience with programming languages such as C#, JavaScript, Node.js, or Java.
Tooling Prerequisites
To avoid any delays with downloading or installing tooling, have the following ready to go ahead of the OpenHack:
- A modern laptop running Windows 10 (1703 or higher), Mac OSX (10.12 or higher), or a supported Ubuntu version
- Install your choice of Integrated Development Environment (IDE) software, such as Visual Studio, Visual Studio Code, Eclipse, or IntelliJ.
Development Environment Configuration
- .NET Core (latest version, 3.1) -- required
- .NET Framework (latest version, 4.8) -- optional
Course Objectives
- Gain experience migrating from a relational database to NoSQL, using tools to automate the process
- Learn how to optimize the NoSQL database through modeling, denormalization, indexing, and partitioning, given a set of customer requirements and query patterns
- Enable app modernization through a scalable, distributed, event-based data pipeline, built on a flexible NoSQL data store and complementary Azure services
- Reduce latency and increase availability and customer reach, through global distribution of the NoSQL database
Course Content
Challenge 1: To the cloud
In this challenge, you will provision the NoSQL database; a minimal provisioning sketch follows the learning objectives below.
Learning objectives:
- Provisioning a NoSQL database in Azure with the following characteristics:
- Enables a simplified process for scaling up and down
- Supports the event sourcing pattern where changes to the data store trigger events that can be processed by any number of listening components in near real-time
- Supports a flexible schema with a multi-region, global distribution
- Stores any arbitrary record in the database
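The sketch below shows one way to satisfy these characteristics, assuming Azure Cosmos DB is the chosen NoSQL database (consistent with the Cosmos DB option named under Technical Scenarios). The endpoint, key, database name, container name, and partition key path are illustrative assumptions; autoscale throughput provides the simplified scale-up/scale-down behaviour, and the flexible schema means no further definition is needed before storing arbitrary records.

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

public static class Provisioning
{
    // Hypothetical account endpoint and key -- replace with your own values.
    private const string Endpoint = "https://<your-account>.documents.azure.com:443/";
    private const string Key = "<your-key>";

    public static async Task Main()
    {
        CosmosClient client = new CosmosClient(Endpoint, Key);

        // Create the database and a container with a partition key and autoscale
        // throughput, which scales between 400 RU/s and the configured maximum
        // without manual intervention.
        Database db = await client.CreateDatabaseIfNotExistsAsync("RetailDb");
        Container orders = await db.CreateContainerIfNotExistsAsync(
            new ContainerProperties(id: "Orders", partitionKeyPath: "/customerId"),
            ThroughputProperties.CreateAutoscaleThroughput(autoscaleMaxThroughput: 4000));
    }
}
```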
Challenge 2: Migrating the database to the cloud
In this challenge, you will migrate all data to the new database; one possible code-based migration pass is sketched after the learning objectives.
Learning objectives:
- Create a repeatable process to migrate from the supplied SQL database to the selected NoSQL database, validate the migration through queries, and explain to the coach how the database can be scaled.
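A minimal sketch of such a repeatable pass is shown below, assuming Cosmos DB as the target, a container partitioned on /id, and hypothetical connection string, table, and column names. In practice teams may instead use tooling such as Azure Data Factory or the Cosmos DB data migration tool; this only illustrates the shape of a code-based approach.

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;
using Microsoft.Data.SqlClient;

public static class SqlToNoSqlMigration
{
    // Hypothetical source query; adjust table and column names to the supplied schema.
    private const string SourceQuery =
        "SELECT ItemId, Title, Description, Price FROM dbo.Items";

    public static async Task RunAsync(string sqlConnectionString, Container target)
    {
        using var sql = new SqlConnection(sqlConnectionString);
        await sql.OpenAsync();

        using var cmd = new SqlCommand(SourceQuery, sql);
        using SqlDataReader reader = await cmd.ExecuteReaderAsync();

        while (await reader.ReadAsync())
        {
            // UpsertItemAsync keeps the process repeatable: re-running the
            // migration overwrites existing documents instead of failing on duplicates.
            await target.UpsertItemAsync(new
            {
                id = reader.GetInt32(0).ToString(),   // document ids are strings
                title = reader.GetString(1),
                description = reader.GetString(2),
                price = reader.GetDecimal(3)
            });
        }
    }
}
```

Validation can then be as simple as comparing a row count from the source query with the result of a `SELECT VALUE COUNT(1) FROM c` query against the target container.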
Challenge 3: Optimize NoSQL design
In this challenge, you will implement optimizations and demonstrate an improvement in query performance and/or cost per query; a sketch for measuring cost per query follows the learning objectives.
Learning objectives:
- Estimate the cost per query for reads and writes, as well as query performance
- Use best practices for optimizing the database after evaluating common queries and workloads, then show a measurable improvement after optimization is complete
- Attendees may need to migrate their data once again
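Because the challenge asks for a measurable before/after comparison, a small probe like the one below can help. Assuming Cosmos DB, every query response reports the request units (RUs) it consumed, so the same query can be costed against the un-optimized and optimized models; the container and query text are whatever your team is evaluating.

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

public static class QueryCostProbe
{
    // Runs a SQL query against the container and returns the total RU charge,
    // summed across all result pages.
    public static async Task<double> MeasureAsync(Container container, string sqlQuery)
    {
        double totalRu = 0;
        FeedIterator<dynamic> it = container.GetItemQueryIterator<dynamic>(sqlQuery);

        while (it.HasMoreResults)
        {
            FeedResponse<dynamic> page = await it.ReadNextAsync();
            totalRu += page.RequestCharge;   // RU cost of this page of results
        }

        Console.WriteLine($"Query cost: {totalRu:F2} RU -- {sqlQuery}");
        return totalRu;
    }
}
```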
Challenge 4: Events are the center of our universe
In this challenge, you will add new features to the solution to support the event sourcing pattern and create a report dashboard; a change-feed-based sketch follows the learning objectives.
Learning objectives:
- Create a caching layer for a subset of the data
- Use the event sourcing pattern on order data that flows into the database
- Use these events to create materialized views, real-time dashboards, and cache invalidation
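One way to react to those data events, assuming Cosmos DB's change feed processor, is sketched below; an Azure Functions Cosmos DB trigger is an equally valid route. The Order type, container names, processor name, and lease container are all assumptions.

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

public class Order
{
    public string id { get; set; }
    public string customerId { get; set; }
    public decimal total { get; set; }
}

public static class OrderEventProcessor
{
    public static async Task StartAsync(Container orders, Container leases)
    {
        // The change feed processor reads inserts and updates from the orders
        // container in near real-time, using the lease container to track progress.
        ChangeFeedProcessor processor = orders
            .GetChangeFeedProcessorBuilder<Order>(
                processorName: "orderEvents",
                onChangesDelegate: HandleChangesAsync)
            .WithInstanceName("worker-1")
            .WithLeaseContainer(leases)
            .Build();

        await processor.StartAsync();   // runs in the background until StopAsync is called
    }

    private static async Task HandleChangesAsync(
        IReadOnlyCollection<Order> changes, CancellationToken cancellationToken)
    {
        foreach (Order order in changes)
        {
            // Each change becomes an event: update a materialized view, push to a
            // real-time dashboard, or invalidate the matching cache entry here.
            Console.WriteLine($"Order {order.id} changed; total={order.total}");
        }
        await Task.CompletedTask;
    }
}
```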
Challenge 5: Improving the search experience
In this challenge, you will implement full-text search capabilities by creating an index on the title and description fields and adding filters to help users quickly narrow the results; a query sketch follows the learning objectives.
Learning objectives:
- Enable full-text, fuzzy, and faceted searching capabilities on the database
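The sketch below assumes Azure Cognitive Search as the external search service, with an index (here called items-index) already built over the title and description fields; the endpoint, key, index, and facet field names are illustrative. Full Lucene query syntax provides fuzzy matching, and facets drive the narrowing filters.

```csharp
using System;
using System.Threading.Tasks;
using Azure;
using Azure.Search.Documents;
using Azure.Search.Documents.Models;

public static class CatalogSearch
{
    public static async Task RunAsync(string serviceEndpoint, string apiKey, string searchText)
    {
        var client = new SearchClient(
            new Uri(serviceEndpoint), "items-index", new AzureKeyCredential(apiKey));

        var options = new SearchOptions
        {
            // Full Lucene syntax allows "~" to mark a term as fuzzy,
            // e.g. "bycicle~" still matches "bicycle".
            QueryType = SearchQueryType.Full,
            IncludeTotalCount = true
        };
        options.Facets.Add("category");            // faceted navigation (assumed field)
        options.Facets.Add("price,interval:100");  // price buckets for filtering

        Response<SearchResults<SearchDocument>> response =
            await client.SearchAsync<SearchDocument>(searchText + "~", options);

        await foreach (SearchResult<SearchDocument> hit in response.Value.GetResultsAsync())
        {
            Console.WriteLine(hit.Document["title"]);
        }
    }
}
```

For multi-term queries, each term would carry its own `~` to stay tolerant of misspellings.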
Challenge 6: Taking over the world (MUAHAHAHA)
In this challenge, you will create a new node or replica of the NoSQL data store in a new Azure region; a client-side configuration sketch follows the learning objectives.
Learning objectives:
- Add the NoSQL database to a new region with full replication between both regions
- Help the customer meet data durability and low latency objectives
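Adding the region itself is an account-level operation (portal, ARM/Bicep, or the Azure CLI); the client-side piece is small. The sketch below, assuming the Cosmos DB .NET SDK and an illustrative pair of regions, lists preferred regions so reads are routed to the nearest healthy replica.

```csharp
using System.Collections.Generic;
using Microsoft.Azure.Cosmos;

public static class GeoDistributedClient
{
    public static CosmosClient Create(string endpoint, string key)
    {
        // With the account replicated to a second region, the SDK routes requests to
        // the nearest preferred region and fails over if one becomes unavailable.
        return new CosmosClient(endpoint, key, new CosmosClientOptions
        {
            ApplicationPreferredRegions = new List<string>
            {
                Regions.WestEurope,   // closest region for this client's users (assumed)
                Regions.EastUS        // fallback replica (assumed)
            }
        });
    }
}
```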
Technical Scenarios
- Migration to NoSQL – given an existing web application and SQL database, perform an initial raw migration to Cosmos DB or another NoSQL database in Azure, creating a repeatable process with various tools and services
- NoSQL data modeling and optimization – Evaluating a relational data store, then adapting the schema to a data model optimized for both write-heavy and read-heavy workloads in NoSQL. Optimization includes combining models within the same collections where appropriate, denormalization and embedding, implementing an indexing strategy suited to the workloads and query patterns, and selecting an optimal partition strategy (see the document-model sketch after this list)
- Event sourcing – Reacting to data events, such as inserts and updates, enabling scenarios such as populating real-time dashboards, creating materialized views (aggregation), and automatic cache invalidation
- Advanced visualizations – UI components that show both real-time and batch data with little to no impact on NoSQL database performance
- Expanding search capabilities – Reaching beyond native indexing and search capabilities provided by the NoSQL database, through an external search service that enables full-text, fuzzy (can handle misspellings), and faceted search against the data catalog
- Global reach – Adding high availability and low latency by replicating data across geographical regions, bringing data closer to users and deployed services
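As a concrete illustration of the data modeling scenario above, the sketch below shows a denormalized order document with embedded line items; all type and property names are assumptions. Where a relational design joins orders and order lines across tables, embedding makes the common "order with its lines" read a single request against one partition, at the cost of duplicating some catalog data.

```csharp
using System;
using System.Collections.Generic;

// Illustrative embedded document model for a write- and read-optimized order store.
public class OrderDocument
{
    public string id { get; set; }               // document id
    public string customerId { get; set; }       // partition key: a customer's orders stay co-located
    public DateTime orderDate { get; set; }
    public decimal total { get; set; }           // pre-aggregated to avoid fan-out on reads
    public List<OrderLine> lines { get; set; } = new List<OrderLine>();
}

public class OrderLine
{
    public string sku { get; set; }
    public string title { get; set; }            // denormalized from the product catalog
    public int quantity { get; set; }
    public decimal unitPrice { get; set; }
}
```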