
Freight Rate Documentation
Release 0.0.1

Allan Nielsen

Dec 16, 2020


CORE CONCEPTS

1 Onion Architecture
  1.1 The need to follow an architecture
  1.2 Getting Started with Onion Architecture
  1.3 Advantages of Onion Architecture

2 CQRS
  2.1 What is CQRS?
  2.2 Pros of CQRS
  2.3 Cons of CQRS

3 MediatR
  3.1 Pipelines – Overview
  3.2 MediatR Pipeline Behaviour
  3.3 Getting Started

4 Fluent Validation
  4.1 The Problem
  4.2 Introducing Fluent Validation – The Solution
  4.3 Fluent Validations with MediatR

5 Repository Pattern
  5.1 What’s a Repository Pattern?
  5.2 Benefits of Repository Pattern

6 Entity Framework Core and Dapper
  6.1 Dapper
  6.2 Dapper vs Entity Framework Core
  6.3 Requirement

7 API Versioning
  7.1 What is API Versioning? (and why do you need it?)
  7.2 Different Ways to Implement API versioning
  7.3 Getting Started
  7.4 URL Based API Versioning
  7.5 Query Based API Versioning
  7.6 HTTP Header Based API Versioning
  7.7 Supporting Multiple API Versioning Schemes
  7.8 Deprecating an API Version

8 Serilog
  8.1 What is Serilog?
  8.2 Setup
  8.3 Log Levels

9 Resources
  9.1 Uncle Bob’s Clean Code lessons

10 Currency - Overview
  10.1 Features
  10.2 Specifics

11 Location - Overview

12 Sample Onion Implementation
  12.1 Implementing Onion Architecture in ASP.NET Core WebApi Project
  12.2 Testing

13 Indices and tables


CHAPTER

ONE

ONION ARCHITECTURE

This describes Onion Architecture as used by the Freight Rate Prototype and its perceived advantages. If you want to build a project that utilises what is described here, see the onion application documentation.

The four tenets of Onion Architecture:

• The application is built around an independent object model

• Inner layers define interfaces. Outer layers implement interfaces

• Direction of coupling is toward the center

• All application core code can be compiled and run separate from infrastructure

Onion Architecture works well with and without DDD patterns. It works well with CQRS, forms over data, and DDD. It is merely an architectural pattern where the core object model is represented in a way that does not accept dependencies on less stable code.

1.1 The need to follow an architecture

To maintain structural sanity in mid to larger solutions, it is recommended to follow some kind of architecture. You may have seen other open source projects having multiple layers of projects within a complex folder structure.

Example

dotnetcore CAP is a library based on .NET Standard that provides a solution for distributed transactions and has the function of an EventBus. It is lightweight, easy to use, and efficient.

1.1.1 Layers vs Tiers

When there is just a logical separation in your application, we can term it as layers or N-Layers. In cases where there is both a physical and logical separation of concerns, it is often referred to as an n-tiered application, where n is the number of separations. 3 is the most common value of N.

This layering can help in the separation of concerns, subdividing the solution into smaller units so that each unit is responsible for a specific task, and also to take advantage of abstraction. For mid to larger scaled projects, layering has very obvious advantages. It lets a specific team or individual work on a particular layer without disturbing the integrity of the others.

Also, it just makes your entire solution look clean, easy to maintain, and for new members, easier to learn.

Before getting into Onion Architecture in the prototype, let’s first refresh our knowledge on N-Layer architecture.


1.1.2 Brief Overview of N-Layer Architecture

Let’s look at one of the most popular architectures: a variant of the N-Layer Architecture. The Presentation Layer usually holds the code that the user can interact with, i.e. WebApi, MVC, WebForms and so on. The Business Logic Layer is probably the most important part of the application; it holds all the logic related to the business features. Ideally, every application has its own dedicated database. In order to access the database, we introduce a Data Access Layer. This layer usually holds ORMs for ASP.NET to fetch/write to the database.

1.1.3 Disadvantages of N-Layer Architecture

To clearly understand the advantages of an Onion Architecture, we will need to study the issues with N-Layer architecture. It is one of the most commonly used solution architectures amongst .NET developers.

Instead of building a highly decoupled structure, we often end up with several layers that depend on each other. This is really bad when building scalable applications and may pose issues as the codebase grows. To keep it clear: the presentation layer depends on the logic layer, which in turn depends on the data access layer, and so on.

Thus, we would be creating a bunch of unnecessary couplings. Is it really needed? In most of the cases the UI (presentation) layer would be coupled to the Data Access Layers as well. This would defeat the purpose of having a clean architecture.

In N-Layer architecture, the database is usually the core of the entire application, i.e. it is the only layer that doesn’t have to depend on anything else. Any small change in the business logic or data access layer may prove dangerous to the integrity of the entire application.

1.2 Getting Started with Onion Architecture

The Onion Architecture, introduced by Jeffrey Palermo, overcomes the issues of the layered architecture with great ease. With Onion Architecture, the major difference is that the Domain Layer (entities, validation rules and behaviour common to a business use case) is at the core of the entire application. This means higher flexibility and less coupling. In this approach, the layers depend only on inner layers; the arrows NEVER point outwards.


Here is how I would break down the structure of the proposed solution.

1.2.1 Domain and Application Layers

These will be at the center of the design. We can refer to these layers as the Core Layers. These layers will not depend on any other layers.

The Domain Layer usually contains the domain knowledge and behaviours. The Application Layer would have interfaces and types.

As mentioned earlier, the core layers will never depend on any other layer. Therefore what we do is create interfaces in the Application layer, and these interfaces get implemented in the external layers. This is also known as the Dependency Inversion Principle.

Example

If your application needs to send a mail, we define an IMailService in the Application layer and implement it outside the core layers. Using DI principles, it is easily possible to switch the implementation. This helps build testable applications.
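As a rough sketch of that idea (the IMailService interface and the SmtpMailService class below are illustrative, not part of the prototype), the abstraction sits in the Application layer while the implementation sits in Infrastructure:

using System.Threading.Tasks;

namespace Application.Interfaces
{
    // Defined in the core (Application) layer: no infrastructure dependencies.
    public interface IMailService
    {
        Task SendAsync(string to, string subject, string body);
    }
}

namespace Infrastructure.Services
{
    using Application.Interfaces;

    // Implemented in the outer (Infrastructure) layer; could use SMTP, SendGrid, etc.
    public class SmtpMailService : IMailService
    {
        public Task SendAsync(string to, string subject, string body)
        {
            // Sending is stubbed out here; only the shape of the dependency matters.
            return Task.CompletedTask;
        }
    }
}

Swapping the implementation then becomes a one-line change in the DI registration, e.g. services.AddTransient<IMailService, SmtpMailService>();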

1.2.2 Presentation Layer

This is where you put the project that the user or other external applications can access. This could be a WebApi, an MVC project, etc.

1.2.3 Infrastructure Layer

A bit more tricky, but this is where you add your infrastructure. Infrastructure can be anything, but is typically the concrete implementation of external services that provide specific functionality for a specific technology.

Example

An Entity Framework implementation for accessing the database, or a service specifically made to generate JWT tokens for authentication, or a Hangfire service.

This should become clearer when we start implementing Onion Architecture in the prototype application.


1.3 Advantages of Onion Architecture

The advantages of this design are as follows.

• Highly Testable – Since the Core has no dependencies on anything else, writing automated tests is flexible and straightforward.

• Database Independent – Since we have a clean separation of data access, it is quite easy to switch between different database providers, e.g. SQL Server and/or MongoDB.

• Switchable UI Layer (Presentation) – Since we are keeping all the crucial logic away from the presentation layer, it is quite easy to switch to another tech – including Blazor.

• Much cleaner code base with well structured projects for better understanding within teams.


CHAPTER

TWO

CQRS

In this article let’s talk about CQRS and its implementation along with MediatR and Entity Framework Core using a code first approach. I will implement this pattern on a WebApi project. The source code of this sample is linked at the end of the post. Of the several design patterns available, CQRS is one of the most commonly used patterns that helps architect the solution to accommodate the Onion Architecture.

2.1 What is CQRS?

CQRS (Command Query Responsibility Segregation) is a design pattern that separates the read and write operations of a data source. Here Command refers to a database command, which can be an Insert, Update or Delete operation, whereas Query refers to querying data from a source. Put another way, commands affect state, queries do not.

It essentially separates the concerns in terms of reading and writing, which makes quite a lot of sense. This pattern originated from the Command Query Separation principle devised by Bertrand Meyer.

Wikipedia

Every method should either be a command that performs an action, or a query that returns data to the caller, but not both. In other words, asking a question should not change the answer. More formally, methods should return a value only if they are referentially transparent and hence possess no side effects.

The problem with traditional architectural patterns is that the same data model or DTO is used to query as well as update a data source. This can be the go-to approach when your application is related to just CRUD operations and nothing more. But when your requirements suddenly start getting complex, this basic approach can prove to be a disaster.

In practical applications, there is always a mismatch between the read and write forms of data, like the extra properties you may require to perform an update. Parallel operations may even lead to data loss in the worst cases. That means you will be stuck with just one Data Transfer Object for the entire lifetime of the application, unless you choose to introduce yet another DTO, which in turn may break your application architecture.

The idea with CQRS is to allow an application to work with different models: you may have one model that has the data needed to update a record, another model to insert a record, yet another to query a record. This gives you flexibility with varying and complex scenarios. You don’t have to rely on just one DTO for the entire CRUD operation by implementing CQRS. In fact, in many cases the command and query requests and responses are the DTO.
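As an illustrative sketch (these class names are hypothetical, not the prototype’s types), each operation carries only the data it needs instead of sharing one DTO:

// Write model: only what an update needs; it affects state.
public class UpdateProductPriceCommand
{
    public int Id { get; set; }
    public decimal NewRate { get; set; }
}

// Read model: only what the caller consumes; it has no side effects.
public class ProductSummary
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Rate { get; set; }
}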


2.2 Pros of CQRS

There are quite a lot of advantages to using the CQRS pattern in your application:

2.2.1 Optimised Data Transfer Objects

Thanks to the segregated approach of this pattern, we will no longer need those complex model classes within our application. Rather, we have one model per data operation. These should not be shared.

2.2.2 Highly Scalable

Having control over the models in accordance with the type of data operations makes your application highly scalable.

2.2.3 Improved Performance

Generally speaking, there are usually many more read operations compared to write operations. With this pattern you could speed up the performance of your read operations by introducing a read-optimised database. The CQRS pattern supports this usage out of the box.

2.2.4 Secure Parallel Operations

Since we have dedicated models per operation, there is no possibility of data loss while doing parallel operations.


2.3 Cons of CQRS

2.3.1 Added Complexity and More Code

The one thing that may concern a few programmers is that this is a code-demanding pattern. In other words, you will end up with more lines of code than you usually would. But everything comes at a price, and this is a small price to pay for the features and possibilities the pattern gives you.


CHAPTER

THREE

MEDIATR

MediatR is a library which essentially can make your controllers thin and decouple the functionality into a more message-driven approach. This is generally coupled with an implementation of the CQRS (Command Query Responsibility Segregation) pattern.
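For illustration only (the ProductsController below is hypothetical; a similar CreateProductCommand is shown in the Fluent Validation chapter), a thin controller simply dispatches a message through MediatR and lets a handler do the work:

using System.Threading.Tasks;
using MediatR;
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/[controller]")]
public class ProductsController : ControllerBase
{
    private readonly IMediator _mediator;

    public ProductsController(IMediator mediator)
    {
        _mediator = mediator;
    }

    [HttpPost]
    public async Task<IActionResult> Create(CreateProductCommand command)
    {
        // All business logic lives in the command handler, not here.
        var id = await _mediator.Send(command);
        return Ok(id);
    }
}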

The idea of pipelines, MediatR Pipeline Behaviour, how to intersect the pipeline and add various services like Logging and Validation are going to be covered here.

3.1 Pipelines – Overview

What happens internally when you send a request to any application? Ideally it returns the response, but there is one thing you might not be aware of yet: pipelines. These requests and responses travel back and forth through pipelines. So, when you send a request, the request message passes from the user through a pipeline towards the application, where you perform the requested operation with the request message. Once done, the application sends back the message as a response through the pipeline towards the user end. Thus these pipelines are completely aware of what the request or response is. This is also a very important concept when learning about middleware in ASP.NET Core web sites and APIs.


Let’s say I want to validate the request object. How would you do it? You would basically write the validation logic which executes after the request has reached the end of the pipeline towards the application. That means, you are validating the request only after it has reached inside the application. Why would you attach the validation logic to the application when you can already validate the incoming requests even before they hit any of the application logic?

A better approach would be to somehow wire up your validation logic within the pipeline, so that the flow becomes:

1. user sends request through pipeline,

2. validation logic intercepts request,

3. if the request is valid, it continues; otherwise a validation exception is thrown.

This makes quite a lot of sense in terms of efficiency. Why hit the application with invalid data when you could filter it out much earlier?


This is not only applicable to validation, but also to various other operations like logging, performance tracking and much more.

3.2 MediatR Pipeline Behaviour

MediatR takes a more pipeline kind of approach where your queries, commands and responses flow through a pipeline. The Pipeline Behaviour was made available from version 3 of this library.

MediatR requests or commands are like the first contact within our application, so why not attach some behaviours? By doing this, we will be able to execute service logic like validation even before the Command or Query handlers know about it. This way, we will be sending only necessary, valid requests to the CQRS implementation. Logging using this Pipeline Behaviour is also a common implementation.

3.3 Getting Started

We will use the CQRS setup where MediatR is already configured. We will be adding validation (via Fluent Validation) and general logging to the commands and requests that go through the pipeline.

Here is what we are going to build: in our CQRS solution, we are going to add validation and logging via the MediatR pipeline.

Thus, any <Feature>Command or <Feature>Query request would be validated even before it hits the application logic. Also, we will log every request and response that goes through the MediatR pipeline.


CHAPTER

FOUR

FLUENT VALIDATION

When it comes to validating models, we normally lean towards data annotations. There are quite a lot of serious issues with this approach for a scalable system. There is a library, Fluent Validation, that can turn up the validation game to a whole new level, giving you total control without mixing concerns.

In this article, we will talk about Fluent Validation and its implementation in the prototype. We will discuss the preferred alternative to data annotations and implement it in a sample.

4.1 The Problem

Data validation is extremely vital for any application. The go-to approach for model validation in most demos is data annotations, where you have to declare attributes over the properties of models.

public class Developer
{
    [Required]
    public string FirstName { get; set; }
    [Required]
    public string LastName { get; set; }
    [EmailAddress]
    public string Email { get; set; }
    [Range(minimum: 5, maximum: 20)]
    public decimal Experience { get; set; }
}

It is fine for beginners and demos, but once you start learning clean code, or begin to understand the SOLID principles of application design, you would just never be happy with data annotations as you were before. It is clearly not a good approach to combine your model and validation logic.

With the implementation of data annotations in .NET classes, the problem is that there will be a lot of duplicated lines of code throughout your application. What if the developer’s model class is to be used in another application/method where this attribute validation changes? What if you need to validate a model that you don’t have access to? Unit testing can get messy as well. You will definitely end up building multiple model classes which will no longer be maintainable.

4.2 Introducing Fluent Validation – The Solution

Fluent Validation is a free-to-use .NET validation library that helps you make your validation logic clean, easy to create, and maintain. It even works on external models that you don’t have access to, with ease. With this library, you can separate the model classes from the validation logic like it is supposed to be. Also, better control of validation is something that makes a developer prefer Fluent Validation.

Fluent Validation uses lambda expressions to build validation rules.


4.3 Fluent Validations with MediatR

For validating our MediatR requests, we will use the Fluent Validation library. Let’s use the following use case as an example:

Use Case

We have an API endpoint that is responsible for creating a product in the database from the request object that includes product name, prices, barcode and so on. But we would want to validate this request in the pipeline itself.

Here are the MediatR Request and Handler

public class CreateProductCommand : IRequest<int>
{
    public string Name { get; set; }
    public string Barcode { get; set; }
    public string Description { get; set; }
    public decimal BuyingPrice { get; set; }
    public decimal Rate { get; set; }

    public class CreateProductCommandHandler : IRequestHandler<CreateProductCommand, int>
    {
        private readonly IApplicationContext _context;

        public CreateProductCommandHandler(IApplicationContext context)
        {
            _context = context;
        }

        public async Task<int> Handle(CreateProductCommand command, CancellationToken cancellationToken)
        {
            var product = new Product();
            product.Barcode = command.Barcode;
            product.Name = command.Name;
            product.BuyingPrice = command.BuyingPrice;
            product.Rate = command.Rate;
            product.Description = command.Description;
            _context.Products.Add(product);
            await _context.SaveChangesAsync();
            return product.Id;
        }
    }
}

Add a new folder to the root of our application and name it Validators. Here is where we are going to add all the validators related to the domain. Since we are going to validate the CreateProductCommand object, let’s follow convention and name our validator CreateProductCommandValidator. So add a new file to the Validators folder named CreateProductCommandValidator.

public class CreateProductCommandValidator : AbstractValidator<CreateProductCommand>
{
    public CreateProductCommandValidator()
    {
        RuleFor(c => c.Barcode).NotEmpty();
        RuleFor(c => c.Name).NotEmpty();
    }
}

We will keep things simple for this article. We create 2 rules, checking that the Name and Barcode are not empty. You could take this a step further by injecting a DbContext into this constructor and checking if the barcode already exists.
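As a hedged sketch of that suggestion (reusing the IApplicationContext from the handler above and assuming its Products member is an EF Core DbSet), the extended validator could look like this:

using System.Threading;
using System.Threading.Tasks;
using FluentValidation;
using Microsoft.EntityFrameworkCore;

public class CreateProductCommandValidator : AbstractValidator<CreateProductCommand>
{
    private readonly IApplicationContext _context;

    public CreateProductCommandValidator(IApplicationContext context)
    {
        _context = context;

        RuleFor(c => c.Name).NotEmpty();
        RuleFor(c => c.Barcode)
            .NotEmpty()
            // MustAsync lets the rule run an asynchronous check against the database.
            .MustAsync(BeUniqueBarcode).WithMessage("The specified barcode already exists.");
    }

    private async Task<bool> BeUniqueBarcode(string barcode, CancellationToken cancellationToken)
    {
        return !await _context.Products.AnyAsync(p => p.Barcode == barcode, cancellationToken);
    }
}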


We have a number of similar validators, one for each command and query. This helps keep the code well organized and easy to test.

Before continuing, let’s register this validator with the DI container. Navigate to the Startup.cs ConfigureServices method and add in the following line.

services.AddValidatorsFromAssembly(typeof(Startup).Assembly);

This essentially registers all the validators that are available within the current assembly.

Now that we have our validator set up, let’s add it to the pipeline behaviour. Create another new folder in the root of the application and name it PipelineBehaviours. Here, add a new class, ValidationBehaviour.cs.

public class ValidationBehaviour<TRequest, TResponse> : IPipelineBehavior<TRequest, TResponse> where TRequest : IRequest<TResponse>
{
    private readonly IEnumerable<IValidator<TRequest>> _validators;

    public ValidationBehaviour(IEnumerable<IValidator<TRequest>> validators)
    {
        _validators = validators;
    }

    public async Task<TResponse> Handle(TRequest request, CancellationToken cancellationToken, RequestHandlerDelegate<TResponse> next)
    {
        if (!_validators.Any())
            return await next();

        var context = new ValidationContext<TRequest>(request);
        var validationResults = await Task.WhenAll(_validators.Select(v => v.ValidateAsync(context, cancellationToken)));
        var failures = validationResults.SelectMany(r => r.Errors).Where(f => f != null).ToList();
        if (failures.Count != 0)
            throw new FluentValidation.ValidationException(failures);

        return await next();
    }
}

We have one last thing to do: register this pipeline behaviour in the DI container. Again, go back to Startup.cs ConfigureServices and add this.

services.AddTransient(typeof(IPipelineBehavior<,>), typeof(ValidationBehaviour<,>));

Since we need to validate each and every request, we add it with a Transient scope.

That’s it, quite simple to set up.


CHAPTER

FIVE

REPOSITORY PATTERN

5.1 What’s a Repository Pattern?

A Repository pattern is a design pattern that mediates data from and to the Domain and Data Access Layers (like Entity Framework Core/Dapper). Repositories are classes that hide the logic required to store or retrieve data. Thus, our application will not care about what kind of ORM we are using, as everything related to the ORM is handled within a repository layer. This allows you to have a cleaner separation of concerns. The Repository pattern is one of the most heavily used design patterns for building cleaner solutions.
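As a minimal sketch (the interface name and members are illustrative, not the prototype’s actual contract), a generic repository abstraction could look like this:

using System.Collections.Generic;
using System.Threading.Tasks;

public interface IGenericRepository<T> where T : class
{
    // The application layer programs against these members only.
    Task<T> GetByIdAsync(int id);
    Task<IReadOnlyList<T>> GetAllAsync();
    Task<T> AddAsync(T entity);
    Task UpdateAsync(T entity);
    Task DeleteAsync(T entity);
}

The application layer depends only on the interface; an EF Core or Dapper based implementation lives in the infrastructure layer.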

5.2 Benefits of Repository Pattern

5.2.1 Reduces Duplicate Queries

Imagine having to write lines of code just to fetch some data from your datastore. Now what if this set of queries is going to be used in multiple places in the application? Not very ideal to write this code over and over again, right? Here is the added advantage of Repository classes. You could write your data access code within the Repository and call it from multiple Controllers/Libraries.

5.2.2 De-couples the application from the Data Access Layer

There are quite a lot of ORMs available for .NET Core. Currently the most popular one is Entity Framework Core. But that could change in the upcoming years. To keep pace with the evolving technologies and to keep our solutions up to date, it is crucial to build applications that can switch over to a new data access technology with minimal impact on our application’s code base.

There can also be cases where you need to use multiple ORMs in a single solution, probably Dapper to fetch the data and EF Core to write the data. This is solely for performance optimizations.

The Repository pattern helps us achieve this by creating an abstraction over the data access layer. Now, you no longer have to depend on EF Core or any other ORM for your application. EF Core becomes one of your options rather than your only option to access data.

Tip

The Architecture should be independent of the Frameworks. – Uncle Bob (Robert Cecil Martin)

Building an enterprise-level .NET Core application really needs repository patterns to keep the codebase future proof for at least the next 20-25 years (after which, probably, the robots will take over).


5.2.3 Is Repository Pattern Dead?

This is one of the most debated topics within the .NET Core community. Microsoft has built Entity Framework Core using the repository and unit of work patterns. So why do we need to add another layer of abstraction over Entity Framework Core, which is itself an abstraction of data access? The answer to this is also given by Microsoft.

Read more here

Microsoft themselves recommend using repository patterns in complex scenarios to reduce the coupling and provide better testability of your solutions. In cases where you want the simplest possible code, you would want to avoid the repository pattern.

Adding the repository has its own benefits too. But I strongly advise not to use design patterns everywhere. Try to use them only when the scenario demands the usage of a design pattern. That being said, the repository pattern is something that can benefit you in the long run.


CHAPTER

SIX

ENTITY FRAMEWORK CORE AND DAPPER

This describes using Entity Framework Core and Dapper together in the same application. Another major point of discussion will be transactions. By the end of this, we will have an application that works with both Entity Framework Core and Dapper alongside each other, and is also intelligent enough to roll back data whenever there is an exception in the process.

We will also take a look at ThrowR, a simple library for .NET Core that can make your code clean and readable by eliminating the use of if statements unnecessarily.

6.1 Dapper

Dapper is a simple Object Mapping Framework or Micro-ORM that helps us map the data from the result of an SQL query to a .NET class efficiently. It would be as simple as executing a SQL SELECT statement using the SQL client object and returning the result as a mapped C# class. It’s more like AutoMapper for the SQL world. This powerful ORM was built by the folks at Stack Overflow and is definitely faster at querying data when compared to the performance of Entity Framework. This is possible because Dapper works directly with the raw SQL and hence the time delay is quite small. This boosts the performance of Dapper.
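As a minimal sketch of that mapping (the Employee shape, table, SQL and connection string are assumptions for illustration, and the System.Data.SqlClient package is assumed to be installed), Dapper runs the raw SQL and maps each row onto the class by column name:

using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using Dapper;

public class Employee
{
    public int Id { get; set; }
    public string Name { get; set; }
    public int DepartmentId { get; set; }
}

public static class EmployeeQueries
{
    public static IEnumerable<Employee> GetByDepartment(string connectionString, int departmentId)
    {
        using IDbConnection connection = new SqlConnection(connectionString);
        // Query<T> executes the SQL and materialises one Employee per row.
        return connection.Query<Employee>(
            "SELECT Id, Name, DepartmentId FROM Employees WHERE DepartmentId = @DepartmentId",
            new { DepartmentId = departmentId });
    }
}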

6.2 Dapper vs Entity Framework Core

Dapper is much faster than Entity Framework Core, considering the fact that there are no bells and whistles in Dapper. It is a straightforward Micro ORM that has minimal features as well. It is always up to the developer to choose between these 2 data access technologies. This does not mean that Entity Framework Core is any slower; with every update, the performance seems to be improving as well. Dapper is heaven for those who still like to work with raw queries rather than LINQ with EF Core.

Now, Entity Framework Core has tons of features included, along with performance improvements as well. So the question is, why choose between Dapper and Entity Framework Core when you can use both and take maximum advantage of both?

Dapper is able to handle complex queries that have multiple joins and some really long business logic. Entity Framework Core is great for class generation, object tracking, mapping to multiple nested classes, and quite a lot more. So it’s usually performance versus features when talking about these 2 ORMs.


6.3 Requirement

We will design a simple ASP.NET Core WebAPI for an imaginary company. This company has a policy that says every Employee has to be linked to a unique Department. To be more clear, every time you add a new employee via the API endpoint, you have to create a new department record as well. A very imaginary requirement, yeah? Along with this, we will have 2 other endpoints that return all Employees and an Employee by Id.

Expanding on the details, we will have to ensure the newly added Department does not already exist. You will get a grasp of this once you get to see the Domain entities.

To demonstrate the usage of Dapper, Entity Framework Core, and both combined, we will implement each of them in the 3 endpoints. For the GetAll endpoint, we will use Dapper. The GetById endpoint would use Entity Framework Core with eager loading to display the Department details as well. And finally, the POST endpoint would take advantage of both these data access technologies and cleanly demonstrate transactions in .NET Core.

Along the way, we will get introduced to a few libraries for .NET Core that could probably save you some development time as well.

6.3.1 Important Aspect to Handle – Transactions

Now, according to our requirement, we need both Entity Framework Core and Dapper to work alongside each other. This is quite easy to achieve, actually. But the important detail to take care of is that we need to ensure that both Entity Framework Core and Dapper participate in the same DB transaction so that the overall process can be robust.

For example, a particular write operation can involve multiple entities and tables. This in turn can have operations that are easy to handle with Entity Framework Core, and, let’s say, a bunch of complex queries that are meant to be executed by Dapper. In such cases, we must make sure that it is possible to roll back the SQL execute operation when any operation/query fails. This is the aspect that can introduce a small complexity to our system design.

If we were not considering this, the overall process would be quite straightforward. Let me put the idea into steps.

1. Configure Entity Framework Core.

2. Configure Dapper. You can achieve this by using the same connection string that is being used by EF Core as well. (Obviously, from appsettings.json.)

3. Register the services into the Container and start using the Context / Dapper as required.

But we will go for a more complex and future-proof mechanism that will handle really everything, including rollbacks and transactions.
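As a hedged sketch of the underlying idea (AppDbContext, the entities and the SQL below are illustrative stand-ins, not the prototype’s actual code), EF Core can open the transaction and Dapper can enlist in it by reusing the same connection:

using System.Data;
using System.Threading.Tasks;
using Dapper;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Storage;

public class Employee { public int Id { get; set; } public string Name { get; set; } public int DepartmentId { get; set; } }
public class Department { public int Id { get; set; } public string Name { get; set; } }

public class AppDbContext : DbContext
{
    public AppDbContext(DbContextOptions<AppDbContext> options) : base(options) { }
    public DbSet<Employee> Employees { get; set; }
    public DbSet<Department> Departments { get; set; }
}

public class EmployeeWriter
{
    private readonly AppDbContext _context;
    public EmployeeWriter(AppDbContext context) => _context = context;

    public async Task AddAsync(Employee employee, Department department)
    {
        // One transaction, opened through EF Core...
        await using IDbContextTransaction tx = await _context.Database.BeginTransactionAsync();
        try
        {
            // ...EF Core writes participate automatically...
            _context.Employees.Add(employee);
            await _context.SaveChangesAsync();

            // ...and Dapper enlists by reusing the same connection and transaction.
            IDbConnection connection = _context.Database.GetDbConnection();
            await connection.ExecuteAsync(
                "INSERT INTO Departments (Name) VALUES (@Name)",
                new { department.Name },
                transaction: tx.GetDbTransaction());

            await tx.CommitAsync();
        }
        catch
        {
            // Any failure rolls back both the EF Core and the Dapper work.
            await tx.RollbackAsync();
            throw;
        }
    }
}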


CHAPTER

SEVEN

API VERSIONING

7.1 What is API Versioning? (and why do you need it?)

Before deploying an API, there is a checklist of a few features that are considered vital. API versioning tops that list. It is highly crucial to anticipate changes that may be required once the API is published and a few clients are already using it. After publishing the API to a production server, we have to be careful with any future change. These changes should not break the existing client applications that are already using our API. It is also not a good idea to go on and change the API calls in each and every client application. This is how the concept of API versioning came about.

API versioning is a technique by which different clients can get different implementations of the same controller based on the request or the URL. So essentially, you build an API that has multiple versions that may behave differently. This is a great approach to accommodate the fact that requirements can change at any given time, and we do not have to compromise the integrity and availability of data for our already existing client applications.

7.2 Different Ways to Implement API versioning

There are multiple ways to achieve API versioning in applications. The commonly used approaches to version a WebApi are as follows:

• Query String based.

• URL based.

• HTTP Header based.

Note: There are other ways, like media type and Accept-header based versioning, that can be quite complex. From a practical point of view, I believe these 3 are the go-to approaches when versioning an API.

7.3 Getting Started

7.3.1 Installing the Package

Microsoft has its own package to facilitate the process of versioning .NET Core APIs. This package is called Microsoft.AspNetCore.Mvc.Versioning.

Install-Package Microsoft.AspNetCore.Mvc.Versioning


7.3.2 Configuring the application

Navigate to the Startup class of the API project and modify the ConfigureServices method to support versioning.

public void ConfigureServices(IServiceCollection services)
{
    #region Api Versioning
    services.AddApiVersioning(config =>
    {
        // Specify the default API Version as 1.0
        config.DefaultApiVersion = new ApiVersion(1, 0);
        // If the client hasn't specified the API version in the request, use the default API version number
        config.AssumeDefaultVersionWhenUnspecified = true;
        // Advertise the API versions supported for the particular endpoint
        config.ReportApiVersions = true;
    });
    #endregion

    services.AddControllers();
}

7.4 URL Based API Versioning

Personally, this is my favorite approach and I implement it in nearly all of the APIs that I work on. It gives a clean separation between different versions. Versions are explicitly mentioned in the URL of the API endpoints. Here is how it looks:

https://secureddata.com/api/v1/users
https://secureddata.com/api/v2/users

7.4.1 Implementation

Here is the code for v1/DataController

namespace Versioning.WebApi.Controllers.v1
{
    [ApiVersion("1.0")]
    [Route("api/v{version:apiVersion}/[controller]")]
    [ApiController]
    public class DataController : ControllerBase
    {
        [HttpGet]
        public string Get()
        {
            return "data from api v1";
        }
    }
}

And here goes the v2/DataController

namespace Versioning.WebApi.Controllers.v2
{
    [ApiVersion("2.0")]
    [Route("api/v{version:apiVersion}/[controller]")]
    [ApiController]
    public class DataController : ControllerBase
    {
        [HttpGet]
        public string Get()
        {
            return "data from api v2";
        }
    }
}

If you notice the difference in the addresses, we have already implemented the API versioning. URL based versioning is more conventional, as the clients and fellow developers are able to make out the version by seeing the URL itself. It is easier to develop APIs using this approach. We will move forward to Query Based Versioning on the same application.

7.5 Query Based API Versioning

While the previous approach has the version within the URL, this technique allows you to pass the version of the API as a parameter in the URL, i.e. a query string. Here is how the URL may look:

https://secureddata.com/api/users?api-version=1
https://secureddata.com/api/users?api-version=2

7.5.1 Implementation

There is not much of a difference from the previous implementation in terms of the code. Just change each of the controllers to [Route("api/[controller]")] instead of [Route("api/v{version:apiVersion}/[controller]")].


When you pass an API version that does not exist, you will receive an error. For example, requesting api-version=999 results in an UnsupportedApiVersion error saying that there is no resource that matches API version 999.

7.6 HTTP Header Based API Versioning

This approach is slightly different compared to the previous implementations, and to test this we need Postman or Swagger. I will use Postman in this demonstration to test the implementation. In the previous 2 techniques, we had the version number either within the URL or passed it to the application via a query string. Here we will pass the API version as an HTTP header in our request. This is exactly why it cannot be demonstrated using just a browser.

7.6.1 Implementation

For this approach, we will have to modify our Startup class to support reading the API version from an HTTP header.

public void ConfigureServices(IServiceCollection services)
{
    #region Api Versioning
    // Add API Versioning to the Project
    services.AddApiVersioning(config =>
    {
        // Specify the default API Version as 1.0
        config.DefaultApiVersion = new ApiVersion(1, 0);
        // If the client hasn't specified the API version in the request, use the default API version number
        config.AssumeDefaultVersionWhenUnspecified = true;
        // Advertise the API versions supported for the particular endpoint
        config.ReportApiVersions = true;
        // HTTP Header based versioning
        config.ApiVersionReader = new HeaderApiVersionReader("x-api-version");
    });
    #endregion

    services.AddControllers();
}

Run the API and open up Postman. Request the endpoint URL and add a header key, "x-api-version", with the required API version.


This usage of versioning can be a bit challenging for clients and may increase the lines of code in the client app.

7.7 Supporting Multiple API Versioning Schemes

It is possible to apply all these approaches to your API out of the box by making these simple changes and letting the client choose which scheme to use.

Firstly, add the route that supports URL based routing. Make sure each of your API controllers has the following attributes.

[Route("api/[controller]")]
[Route("api/v{version:apiVersion}/[controller]")]

Next, we will have to combine the HTTP header and query based versioning. For this, go back to ConfigureServices in the Startup class and modify the last line of code in the AddApiVersioning extension.

config.ApiVersionReader = ApiVersionReader.Combine(new HeaderApiVersionReader("x-api-version"), new QueryStringApiVersionReader("api-version"));

Now you will be able to use all the 3 API versioning Schemes seamlessly.


7.8 Deprecating an API Version

Along with advertising the supported API versions of an endpoint, the versions that are going to be removed in the near future can also be advertised in the HTTP response headers. This can be done by modifying the ApiVersion attribute of the specific version of the API controller.

// DEPRECATING an API Version
[ApiVersion("1.0", Deprecated = true)]


CHAPTER

EIGHT

SERILOG

In this article, we’ll go through Serilog and its implementation. By default, ASP.NET Core comes with some basic logging features built-in; you must have seen the ILogger interface throughout .NET Core application demos. But what if we want more control over how and where to log the details? That is where logging frameworks come into play, and Serilog is one of the most popular libraries for .NET Core applications.

8.1 What is Serilog?

Serilog is a third-party logging library that plugs into the default ILogger of our application with its own implementations. It enables the developers to log the events into various destinations like console, file, database, and more. Now, if you are already using a database in your .NET Core application, logging events to a database can be a good option. Serilog supports structured logging, which allows more details and information about the event to be logged. With structured logging in place, you could use these logs to debug in a very logical way.

8.2 Setup

We will implement Serilog in a .NET Core web application (Razor Pages). Since our focus is on logging and understanding various related concepts, we will keep the project setup simple and straightforward.

8.2.1 Logging with the Default Logger

As I had mentioned earlier, .NET Core applications ship with a default built-in logging system which includes some basic logging functions. To understand logging, let’s see how the basic logger works. Once you have created your WebApplication solution, navigate to Pages/Index.cshtml/Index.cshtml.cs. You can see the constructor injection of the ILogger interface. This is the default logger from Microsoft.

In the OnGet method of the IndexModel, let’s add a way to demonstrate logging and also use the try-catch block. Here I will throw a dummy exception so that we can understand logging better. Also note that we will not be changing anything on the class further in this demonstration.

public void OnGet()
{
    _logger.LogInformation("Requested the Index Page");
    int count;
    try
    {
        for (count = 0; count <= 5; count++)
        {
            if (count == 3)
                throw new Exception("RandomException");
        }
    }
    catch (Exception ex)
    {
        _logger.LogError(ex, "Exception Caught");
    }
}

The OnGet method is fired every time you request the Index Page (Home Page). So, as the code suggests, I am logging a message that says "Requested the Index Page" every time you request this page. After that it runs a loop, and when the iteration count is 3, it throws a dummy exception "RandomException" which in turn gets caught in the catch block. This is logged as an error. This way, we have a function that mimics a practical production-level function.

8.3 Log Levels

Logging levels are the fundamental concept of logging. When we wrote '_logger.LogInformation("Requested the Index Page");', we told the application that this is a log with the log level set to Information. Log levels make sense because they allow you to define the type of log. Is it a critical log? Just a debug message? A warning message?

There are 6 log-levels included:

• Trace – Detailed messages with sensitive app data.

• Debug – Useful for the development environment.

• Information – General messages, like the way we mentioned earlier.

• Warning – For unexpected events.

• Error – For exceptions and errors.

• Critical – For failures that may need immediate attention.

Note that Serilog may or may not have the same names for each level.
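For illustration (the RateImportJob class below is hypothetical), this is how the different levels are written with the injected ILogger:

using System;
using Microsoft.Extensions.Logging;

public class RateImportJob
{
    private readonly ILogger<RateImportJob> _logger;
    public RateImportJob(ILogger<RateImportJob> logger) => _logger = logger;

    public void Run()
    {
        _logger.LogDebug("Starting rate import");                                   // Debug
        _logger.LogInformation("Imported {Count} exchange rates", 42);              // Information
        _logger.LogWarning("No rates published today; using previous day's rates"); // Warning
        try
        {
            throw new InvalidOperationException("demo failure");
        }
        catch (Exception ex)
        {
            _logger.LogError(ex, "Rate import failed");                             // Error
        }
    }
}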

8.3.1 Default Log Settings

The default settings for our logger are specified in appsettings.json. These settings allow you to define what level of logs you need from a particular component. For example, any log message generated by Microsoft components with level Warning and above is logged to the console. This is the basic idea of log settings.

{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft": "Warning",
      "Microsoft.Hosting.Lifetime": "Information"
    }
  }
}

With that out of the way, let’s start the actual implementation of Serilog in our .NET Core application.


8.3.2 Serilog Enrichers

To enable structured logging and to unleash the full potential of Serilog, we use enrichers. These enrichers give you additional details, like the machine name, process id and thread id at the time the log event occurred, for better diagnostics. It makes a developer’s life quite simple. We will use the enrichers later in this guide.

8.3.3 Serilog Sinks

Serilog Sinks, in simpler words, relate to destinations for logging the data. In the packages that we are going to install to our .NET Core application, Sinks for Console and File are included out of the box. That means we can write logs to the Console and File System without adding any extra packages. Serilog supports various other destinations like MSSQL, SQLite, SEQ and more.

8.3.4 Installing the Required Packages

For now, these are the packages that you require. Install them via the NuGet Package Manager or Console.

Install-Package Serilog.AspNetCore
Install-Package Serilog.Settings.Configuration
Install-Package Serilog.Enrichers.Environment
Install-Package Serilog.Enrichers.Process
Install-Package Serilog.Enrichers.Thread

8.3.5 Configuring Serilog

Our intention is to use Serilog instead of the default logger. For this, we will need to configure Serilog at the entry point of our .NET Core application, i.e. the Program.cs file. Navigate to Program.cs and make the following changes:

public static IHostBuilder CreateHostBuilder(string[] args) =>
    Host.CreateDefaultBuilder(args)
        .UseSerilog() // Uses Serilog instead of the default .NET Logger
        .ConfigureWebHostDefaults(webBuilder =>
        {
            webBuilder.UseStartup<Startup>();
        });

public static void Main(string[] args)
{
    // Read Configuration from appSettings
    var config = new ConfigurationBuilder()
        .AddJsonFile("appsettings.json")
        .Build();

    // Initialize Logger
    Log.Logger = new LoggerConfiguration()
        .ReadFrom.Configuration(config)
        .CreateLogger();

    try
    {
        Log.Information("Application Starting.");
        CreateHostBuilder(args).Build().Run();
    }
    catch (Exception ex)
    {
        Log.Fatal(ex, "The Application failed to start.");
    }
    finally
    {
        Log.CloseAndFlush();
    }
}

8.3.6 Setting up Serilog

Navigate to appsettings.json, remove the default logging settings, and replace them with the following.

{
  "AllowedHosts": "*",
  "Serilog": {
    "Using": [],
    "MinimumLevel": {
      "Default": "Information",
      "Override": {
        "Microsoft": "Warning",
        "System": "Warning"
      }
    },
    "WriteTo": [
      {
        "Name": "Console"
      },
      {
        "Name": "File",
        "Args": {
          "path": "D:\\Logs\\log.txt",
          "outputTemplate": "{Timestamp} {Message}{NewLine:1}{Exception:1}"
        }
      }
    ],
    "Enrich": [
      "FromLogContext",
      "WithMachineName",
      "WithProcessId",
      "WithThreadId"
    ],
    "Properties": {
      "ApplicationName": "Serilog.WebApplication"
    }
  }
}


CHAPTER

NINE

RESOURCES

Below you will find many of the resources I have used in constructing the prototype, trying to adhere to the principles of clean code and onion architecture. My biggest influence has always been Bob Martin’s teachings and I strive to teach others the same. If you have a spare weekend, sit down and binge watch the Clean Code lessons below and take in the principles he teaches.

9.1 Uncle Bob’s Clean Code lessons

Note: My favourite quote from all these lessons: "You know you are working on clean code when each routine you read turns out to be pretty much what you expected..." – Ward Cunningham


CHAPTER

TEN

CURRENCY - OVERVIEW

The currency module’s purpose is to provide Exchange Rates for a given date using a base currency and another currency.

The API endpoint is the easiest and fastest way to access the exchange rate data.

Choose your base currency and the endpoint will simply return the conversion rates from your base currency code to all the other supported currencies in an easy-to-parse JSON format.

Business Rules

• Currencies need to use 3 alpha character codes as per ISO 4217 on requests. We use ISO 4217 Three Letter Currency Codes - e.g. USD for US Dollars, EUR for Euro etc. Here are the codes we support.

• Exchange Rates are imported once a day (using the external 3rd party service ExchangeRate-Api), i.e. GET https://v6.exchangerate-api.com/v6/YOUR-API-KEY/latest/USD

• If an exchange rate isn’t available for a date, the closest previous date should be used.

• If a currency is not supported on a request, an exception should be thrown.

• When comparing Exchange Rates, use the standard compare method on effective dates first, from currency next and to currency last (see the sketch below).
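A minimal sketch of that comparison rule, assuming a hypothetical ExchangeRate entity (the prototype’s actual class may differ):

using System;

public class ExchangeRate : IComparable<ExchangeRate>
{
    public DateTime EffectiveDate { get; set; }
    public string FromCurrency { get; set; } // ISO 4217 code, e.g. "USD"
    public string ToCurrency { get; set; }   // ISO 4217 code, e.g. "EUR"
    public decimal Rate { get; set; }

    public int CompareTo(ExchangeRate other)
    {
        if (other is null) return 1;

        // Effective date first...
        int byDate = EffectiveDate.CompareTo(other.EffectiveDate);
        if (byDate != 0) return byDate;

        // ...then from currency...
        int byFrom = string.Compare(FromCurrency, other.FromCurrency, StringComparison.OrdinalIgnoreCase);
        if (byFrom != 0) return byFrom;

        // ...and to currency last.
        return string.Compare(ToCurrency, other.ToCurrency, StringComparison.OrdinalIgnoreCase);
    }
}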

10.1 Features

• gRPC services

• MediatR

• Fluent Validation

10.2 Specifics

10.2.1 Supported Currencies

Support Exchange Rate API requests for the currency codes listed below. Please note that ISO 4217 Three Letter Currency Codes can be deprecated and replaced over time. An example would be Mexico - which used MXP prior to the 1993 currency revaluation but now uses MXN. Some online lists of currency codes still use old codes - so check here if you're uncertain about which codes to include in your currency conversion project.

You can read more about ISO 4217 codes on Wikipedia.

Code  Name                   Country
AED   UAE Dirham             United Arab Emirates
ARS   Argentine Peso         Argentina
AUD   Australian Dollar      Australia
BGN   Bulgarian Lev          Bulgaria
BRL   Brazilian Real         Brazil
BSD   Bahamian Dollar        Bahamas
CAD   Canadian Dollar        Canada
CHF   Swiss Franc            Switzerland
CLP   Chilean Peso           Chile
CNY   Chinese Renminbi       China
COP   Colombian Peso         Colombia
CZK   Czech Koruna           Czech Republic
DKK   Danish Krone           Denmark
DOP   Dominican Peso         Dominican Republic
EGP   Egyptian Pound         Egypt
EUR   Euro                   Germany
EUR   Euro                   Austria
EUR   Euro                   Belgium
EUR   Euro                   Cyprus
EUR   Euro                   Estonia
EUR   Euro                   Finland
EUR   Euro                   France
EUR   Euro                   Greece
EUR   Euro                   Ireland
EUR   Euro                   Italy
EUR   Euro                   Latvia
EUR   Euro                   Lithuania
EUR   Euro                   Luxembourg
EUR   Euro                   Malta
EUR   Euro                   Netherlands
EUR   Euro                   Portugal
EUR   Euro                   Slovakia
EUR   Euro                   Slovenia
EUR   Euro                   Spain
FJD   Fiji Dollar            Fiji
GBP   Pound Sterling         United Kingdom
GTQ   Guatemalan Quetzal     Guatemala
HKD   Hong Kong Dollar       Hong Kong
HRK   Croatian Kuna          Croatia
HUF   Hungarian Forint       Hungary
IDR   Indonesian Rupiah      Indonesia
ILS   Israeli New Shekel     Israel
INR   Indian Rupee           India
ISK   Icelandic Krona        Iceland
JPY   Japanese Yen           Japan
KRW   South Korean Won       South Korea
KZT   Kazakhstani Tenge      Kazakhstan
MVR   Maldivian Rufiyaa      Maldives
MXN   Mexican Peso           Mexico
MYR   Malaysian Ringgit      Malaysia
NOK   Norwegian Krone        Norway
NZD   New Zealand Dollar     New Zealand
PAB   Panamanian Balboa      Panama
PEN   Peruvian Sol           Peru
PHP   Philippine Peso        Philippines
PKR   Pakistani Rupee        Pakistan
PLN   Polish Zloty           Poland
PYG   Paraguayan Guarani     Paraguay
RON   Romanian Leu           Romania
RUB   Russian Ruble          Russia
SAR   Saudi Riyal            Saudi Arabia
SEK   Swedish Krona          Sweden
SGD   Singapore Dollar       Singapore
THB   Thai Baht              Thailand
TRY   Turkish Lira           Turkey
TWD   New Taiwan Dollar      Taiwan
UAH   Ukrainian Hryvnia      Ukraine
USD   United States Dollar   United States
UYU   Uruguayan Peso         Uruguay
ZAR   South African Rand     South Africa

10.2.2 gRPC service

The ExchangeRateService


CHAPTER

ELEVEN

LOCATION - OVERVIEW


CHAPTER

TWELVE

SAMPLE ONION IMPLEMENTATION

We will build a WebApi that follows a variant of Onion Architecture so that we get to see why it is important to implement such an architecture. You can find the source code of this implementation somewhere yet to be defined.

Another more fully implemented demonstration can be found at Onion-DevOps-Architecture, which is written and maintained by Jeff.

12.1 Implementing Onion Architecture in ASP.NET Core WebApi Project

To keep things simple but demonstrate the architecture to the fullest, we will build an ASP.NET Core Web API that is quite scalable. For this article, let’s have a WebApi that has just one entity, Product. We will perform CRUD operations on it while using the Onion architecture. This will give you quite a clear picture.

Here is a list of features and tech we will be using for this setup.

• Onion Architecture

• Entity Framework Core

• .NET Core 3.1 Library / .NET Standard 2.1 Library / ASP.NET Core 3.1 WebApi

• Swagger

• CQRS / Mediator Pattern using MediatR Library

• Wrapper Class for Responses

• CRUD Operations

• Inverted Dependencies

• API Versioning

12.1.1 Setting up the Solution Structure

We will start off by creating a Blank Solution on Visual Studio.


Let’s give it a proper Name.

Under the Blank Solution, add 3 new folders.

• Core – will contain the Domain and Application layer Projects

• Infrastructure – will include any projects related to the Infrastructure of the ASP.NET Core 3.1 Web Api (Authentication, Persistence etc)

• Presentation – the projects that are linked to the UI or API. In our case, this folder will hold the API Project.

Let’s start adding the required projects. Firstly, under the Core folder, add a new .NET Standard Library and name it Domain.

Why .NET Standard? We know that the Domain and Application projects do not depend on any other layers. There is also the fact that these projects can be shared with other solutions if needed (maybe another solution that is not .NET Core, but .NET Framework 4.7). Get the point?


Note: A wise person once said – “Delete the Default Class1 Created by Visual Studio. Always Delete them.”

After creating the Domain project, right click on properties and change the target framework to .NET Standard 2.1 (which is the latest .NET Standard version at the time of writing this article).

Similarly, create another .NET Standard Library project in the Core folder. Name it Application. Do not forget to change the target version here as well.

Next, let’s go to the Infrastructure folder and add a layer for the database (EF Core). This is going to be a .NET Core Library project. We will name it Persistence.

Finally, in the Presentation layer, add a new ASP.NET Core 3.1 WebApi Project and name it WebApi.
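Putting those steps together, the solution should now look roughly like the sketch below (the solution name itself is an assumption; the folders and projects are the ones we just created):

OnionArchitecture.sln
    Core
        Domain          (.NET Standard 2.1 class library)
        Application     (.NET Standard 2.1 class library)
    Infrastructure
        Persistence     (.NET Core class library)
    Presentation
        WebApi          (ASP.NET Core 3.1 WebApi)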

This is what we will be having right now. You can see the clear separation of concerns as we have read earlier. Let's start building up the architecture now.


12.1.2 Adding Swagger To WebApi Project

Tip #1

Always use Swagger while working with WebApis. It is extremely helpful to have it.

Install the following packages to the WebApi Project via the Package Manager Console.

Install-Package Swashbuckle.AspNetCore
Install-Package Swashbuckle.AspNetCore.Swagger

We will have to register Swagger within the application service container. Navigate to ../Startup.cs and add these lines to the ConfigureServices method.

#region Swagger
services.AddSwaggerGen(c =>
{
    c.IncludeXmlComments(string.Format(@"{0}\OnionArchitecture.xml", System.AppDomain.CurrentDomain.BaseDirectory));
    c.SwaggerDoc("v1", new OpenApiInfo
    {
        Version = "v1",
        Title = "OnionArchitecture",
    });
});
#endregion

Then, add these lines to the Configure method.

#region Swagger
// Enable middleware to serve generated Swagger as a JSON endpoint.
app.UseSwagger();

// Enable middleware to serve swagger-ui (HTML, JS, CSS, etc.),
// specifying the Swagger JSON endpoint.
app.UseSwaggerUI(c =>
{
    c.SwaggerEndpoint("/swagger/v1/swagger.json", "OnionArchitecture");
});
#endregion

Next, we will need to add the XML file (for Swagger documentation). To do this, right-click the WebApi Project and go to Properties. In the Build tab, enable the XML documentation file and give it an appropriate file name and location. I have added the XML file to the root of the API Project.


Make sure that the WebApi Project is selected as the Startup Project. Now build and run the application and navigate to ../swagger. We have got Swagger up and running.

Tip #2

While running the application, you would see that it navigates to ../weatherforecast by default. This is because of the launchSettings.json settings. In the WebApi Project, under the Properties drill-down, you can find a launchSettings.json file. This file holds all the configuration required for the app launch. Change the launch URL to swagger. Thus, Swagger will open up by default every time you run the application. This helps you save some time.
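As a minimal sketch, the relevant profile in launchSettings.json could look like this (the profile name, ports, and other values are illustrative assumptions; only launchUrl matters for this tip):

{
  "profiles": {
    "WebApi": {
      "commandName": "Project",
      "launchBrowser": true,
      "launchUrl": "swagger",
      "applicationUrl": "https://localhost:5001;http://localhost:5000",
      "environmentVariables": {
        "ASPNETCORE_ENVIRONMENT": "Development"
      }
    }
  }
}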

12.1.3 Adding The Entities to the Domain Project

Now, let's work on the Core layers, starting from the Domain Project. So what is the function of the Domain Layer? It basically has the models/entities, exceptions, validation rules, settings, and anything that is quite common throughout the solution.

Let's start by adding a BaseEntity class at Common/BaseEntity.cs in the Domain Project. This abstract class will be used as a base class for our entities.

public abstract class BaseEntity
{
    public int Id { get; set; }
}


Now add a Product class that inherits the Id from the BaseEntity. Create a new class Entities/Product.cs in the Domain Project.

public class Product : BaseEntity
{
    public string Name { get; set; }
    public string Barcode { get; set; }
    public string Description { get; set; }
    public decimal Rate { get; set; }
}

12.1.4 Adding the Required Interfaces And Packages in Application Layer

As mentioned earlier, the Application Layer will contain the interfaces and types that are specific to this application.

Firstly, add a reference to the Domain Project.

Then, install the required packages via Console.

Install-Package MediatR.Extensions.Microsoft.DependencyInjection
Install-Package Microsoft.EntityFrameworkCore

We have an entity named Product. Now we need to establish this class as a table using Entity Framework Core, so we will need an ApplicationDbContext. But the catch is that we won't create the actual concrete implementation of the ApplicationDbContext here in the Application Layer. Rather, we will just add an IApplicationDbContext interface so that the EF logic does not fall under the Application Layer, but goes to the Persistence Layer, which is outside the core.

This is how you can invert the dependencies to build scalable applications. The advantage is that, if tomorrow you need a different implementation of the ApplicationDbContext, you don't need to touch the existing code base; you just add another Infrastructure layer for the purpose and implement IApplicationDbContext. As simple as that.

Create a new folder named Interfaces in the Application Project. Add a new interface in it, IApplicationDbContext.

public interface IApplicationDbContext
{
    DbSet<Product> Products { get; set; }
    Task<int> SaveChanges();
}

This is another variant that I have noticed in many huge solutions. Let's say you have around 100 interfaces and 100 implementations. Do you add all these 100 lines of code to Startup.cs to register them in the container? That would be insane from a maintainability point of view. To keep things clean, what we can do is create a DependencyInjection static class for every layer of the solution and only add the required services to the corresponding class.

In this way, we are decentralizing the code lines and keeping our Startup class neat and tidy. Here is an extension method over the IServiceCollection.

public static class DependencyInjection
{
    public static void AddApplication(this IServiceCollection services)
    {
        services.AddMediatR(Assembly.GetExecutingAssembly());
    }
}

Here, we will just add MediatR to the service collection. We will implement the Mediator pattern later in this tutorial.

And all you have to do in the WebApi's Startup class is just add one line. This essentially registers all the services associated with the Application Layer into the container. Quite handy, yeah?


services.AddApplication();

12.1.5 Implementing MediatR for CRUD Operations

In the Application Layer, create a new folder called Features. This will have all the logic related to each Feature / Entity. Under this folder, add a new one and name it ProductFeatures. Then add a Commands and a Queries folder to it.

I have already written a detailed article on the MediatR and CQRS pattern in an ASP.NET Core 3.1 WebApi Project. You can follow that article and add the required Commands and Handlers to the Application Layer.

I will add the links to the source code of each file. Basically, these 5 classes would cover our CRUD operations implementation; a rough sketch of the Create command is shown after the list. Make sure that you have gone through my article about CQRS for ASP.NET Core before proceeding.

• CreateCommand

• DeleteCommand

• UpdateCommand

• GetAllQuery

• GetByIdQuery
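The actual commands and handlers live in the linked article's source code. As a rough sketch of the shape they take, a Create command together with its MediatR handler could look something like the following (the namespaces and using directives are assumptions based on the folder structure above, not the article's exact code):

using System.Threading;
using System.Threading.Tasks;
using MediatR;
using Domain.Entities;           // assumed namespace of the Product entity
using Application.Interfaces;    // assumed namespace of IApplicationDbContext

namespace Application.Features.ProductFeatures.Commands
{
    // Command carrying the data required to create a Product.
    // IRequest<int> means the handler returns the Id of the new record.
    public class CreateProductCommand : IRequest<int>
    {
        public string Name { get; set; }
        public string Barcode { get; set; }
        public string Description { get; set; }
        public decimal Rate { get; set; }

        public class CreateProductCommandHandler : IRequestHandler<CreateProductCommand, int>
        {
            private readonly IApplicationDbContext _context;

            public CreateProductCommandHandler(IApplicationDbContext context)
            {
                _context = context;
            }

            public async Task<int> Handle(CreateProductCommand command, CancellationToken cancellationToken)
            {
                // Map the command onto the entity and persist it through the abstraction,
                // not the concrete DbContext.
                var product = new Product
                {
                    Name = command.Name,
                    Barcode = command.Barcode,
                    Description = command.Description,
                    Rate = command.Rate
                };
                _context.Products.Add(product);
                await _context.SaveChanges();
                return product.Id;
            }
        }
    }
}

The other commands and queries follow the same pattern: a request class describing the input, and a handler that talks only to IApplicationDbContext.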

12.1.6 Setting Up EF Core on the Persistence Project

Firstly, add a connection string to the appsettings.json found in the WebApi Project.

{
  "ConnectionStrings": {
    "DefaultConnection": "Server=(localdb)\\mssqllocaldb;Database=onionDb;Trusted_Connection=True;MultipleActiveResultSets=true"
  }
}

With the CRUD logic out of the way, let's set up EF Core in the Persistence Layer and try to generate a database. Install the following packages in the Persistence Project.

Install-Package Microsoft.EntityFrameworkCore
Install-Package Microsoft.EntityFrameworkCore.SqlServer

Remember we created an IApplicationDbContext interface in the Application Layer? This is where we will be implementing it. Create a new folder named Context and add a new class ApplicationDbContext. This class will implement IApplicationDbContext.

public class ApplicationDbContext : DbContext, IApplicationDbContext
{
    public ApplicationDbContext(DbContextOptions<ApplicationDbContext> options)
        : base(options)
    {
    }

    public DbSet<Product> Products { get; set; }

    public async Task<int> SaveChanges()
    {
        return await base.SaveChangesAsync();
    }
}

We will have to register IApplicationDbContext and bind it to ApplicationDbContext, right? Similar to the Application Layer, we will have to create a new class just to register the dependencies and services of this layer to the service container.

Add a new static class, DependencyInjection

public static class DependencyInjection
{
    public static void AddPersistence(this IServiceCollection services, IConfiguration configuration)
    {
        services.AddDbContext<ApplicationDbContext>(options =>
            options.UseSqlServer(configuration.GetConnectionString("DefaultConnection"),
                b => b.MigrationsAssembly(typeof(ApplicationDbContext).Assembly.FullName)));
        services.AddScoped<IApplicationDbContext>(provider => provider.GetService<ApplicationDbContext>());
    }
}

And in the Startup class / ConfigureServices method of the WebApi, just add the following line. You can now see the advantage of this kind of approach.

services.AddPersistence(Configuration);

12.1.7 Generate the Migrations and the Database

As our ApplicationDbContext is configured, let's generate the migrations and ultimately create a database using the EF Core Tools – Code First approach.

Install the following packages in the WebApi Project.

Install-Package Microsoft.EntityFrameworkCore.Tools
Install-Package Microsoft.EntityFrameworkCore.Design

Now, open up the Package Manager Console and select the Persistence project as the default project (as mentioned in the screenshot below). This is because the actual ApplicationDbContext is implemented in the Persistence Layer, remember?

Then, run the following commands to add migrations and to generate / update the database.

add-migration Initial
update-database


You will get a ‘Done’ message.

12.1.8 Adding API Versioning

Just to make our solution a bit cleaner, let's also add API Versioning to the WebApi.

I have written a detailed article on API Versioning in ASP.NET Core 3.1 WebApi. Feel free to read it to get a complete idea of this concept.

Install the required package.

Install-Package Microsoft.AspNetCore.Mvc.Versioning

In the Startup/ConfigureServices of the API project, add these lines to register the Versioning.

#region API Versioning
// Add API Versioning to the Project
services.AddApiVersioning(config =>
{
    // Specify the default API Version as 1.0
    config.DefaultApiVersion = new ApiVersion(1, 0);
    // If the client hasn't specified the API version in the request, use the default API version number
    config.AssumeDefaultVersionWhenUnspecified = true;
    // Advertise the API versions supported for the particular endpoint
    config.ReportApiVersions = true;
});
#endregion

12.1.9 Setting up the Controllers

This is the final step of setting up Onion Architecture in ASP.NET Core. We will have to wire up a controller to the Application Layer.

Create a Base API Controller. This will be an empty API Controller which will have API Versioning enabled in the attribute and also a MediatR object. What is the aim of this Base Controller? It is just to reduce the lines of code. Say we add a new controller: we will not have to re-define the API Versioning route nor the MediatR object. We will just add the BaseApiController as the base class. Get it? I will show it in the implementation.

Add a new empty API Controller in the Controllers folder and name it BaseApiController.

using MediatR;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.DependencyInjection;

namespace WebApi.Controllers
{
    [ApiController]
    [Route("api/v{version:apiVersion}/[controller]")]
    public abstract class BaseApiController : ControllerBase
    {
        private IMediator _mediator;
        protected IMediator Mediator => _mediator ??= HttpContext.RequestServices.GetService<IMediator>();
    }
}

You can see that we are adding the API Versioning data to the route attribute and also creating an IMediator object.

Next, let's create our actual entity endpoint. Create a new folder inside the Controllers folder and name it 'v1'. This means that this folder will contain all the Version 1 API Controllers. Read more about API Versioning to understand the need for this here.

Inside the v1 folder, add a new empty API Controller named ProductController. Since this is a very basic controller that calls the mediator object, I will not go in deep. However, I have previously written a detailed article on CQRS implementation in ASP.NET Core 3.1 API. You could go through that article, which covers the same scenario. Read it here.

[ApiVersion("1.0")]
public class ProductController : BaseApiController
{
    /// <summary>
    /// Creates a New Product.
    /// </summary>
    /// <param name="command"></param>
    /// <returns></returns>
    [HttpPost]
    public async Task<IActionResult> Create(CreateProductCommand command)
    {
        return Ok(await Mediator.Send(command));
    }

    /// <summary>
    /// Gets all Products.
    /// </summary>
    /// <returns></returns>
    [HttpGet]
    public async Task<IActionResult> GetAll()
    {
        return Ok(await Mediator.Send(new GetAllProductsQuery()));
    }

    /// <summary>
    /// Gets Product Entity by Id.
    /// </summary>
    /// <param name="id"></param>
    /// <returns></returns>
    [HttpGet("{id}")]
    public async Task<IActionResult> GetById(int id)
    {
        return Ok(await Mediator.Send(new GetProductByIdQuery { Id = id }));
    }

    /// <summary>
    /// Deletes Product Entity based on Id.
    /// </summary>
    /// <param name="id"></param>
    /// <returns></returns>
    [HttpDelete("{id}")]
    public async Task<IActionResult> Delete(int id)
    {
        return Ok(await Mediator.Send(new DeleteProductByIdCommand { Id = id }));
    }

    /// <summary>
    /// Updates the Product Entity based on Id.
    /// </summary>
    /// <param name="id"></param>
    /// <param name="command"></param>
    /// <returns></returns>
    [HttpPut("[action]")]
    public async Task<IActionResult> Update(int id, UpdateProductCommand command)
    {
        if (id != command.Id)
        {
            return BadRequest();
        }
        return Ok(await Mediator.Send(command));
    }
}

That's quite everything in this simple yet powerful implementation of Onion Architecture in ASP.NET Core. Build the application and let's test it.
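For reference, after all of the sections above, the WebApi's ConfigureServices contains roughly the following registrations. This is a consolidated sketch; the exact ordering and the AddControllers call are illustrative, and the Swagger block is abbreviated:

public void ConfigureServices(IServiceCollection services)
{
    // Layer registrations via the extension methods created earlier
    services.AddApplication();
    services.AddPersistence(Configuration);

    // API Versioning and Swagger, as configured in the previous sections
    services.AddApiVersioning(config =>
    {
        config.DefaultApiVersion = new ApiVersion(1, 0);
        config.AssumeDefaultVersionWhenUnspecified = true;
        config.ReportApiVersions = true;
    });
    services.AddSwaggerGen(c =>
    {
        c.SwaggerDoc("v1", new OpenApiInfo { Version = "v1", Title = "OnionArchitecture" });
    });

    services.AddControllers();
}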

Since we are already talking about a form of Clean Architecture in ASP.NET Core applications, it would help if you read about certain tips to write clean and scalable C# code. This knowledge will drastically improve the way you start building applications in .NET – read the article here (20 Tips to write Clean C# Code).

12.2 Testing

Run the application and open up Swagger. We will do a simple test to ensure that our solution works. I will just create a new product and make a request to query all the existing products as well.
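For example, posting a body like the one below to /api/v1/product through Swagger, and then calling GET on the same route, should return the newly created record (the field names come from the Product entity; the values are just sample data):

{
  "name": "Sample Product",
  "barcode": "0123456789",
  "description": "A product created through Swagger for testing",
  "rate": 99.99
}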

You can see that we receive the expected data.


CHAPTER

THIRTEEN

INDICES AND TABLES

• genindex

• modindex

• search
