
VITA – A Powerful and Flexible ORM and Additional Building Blocks for .NET Applications

Going over key features of the VITA open source ORM and .net application framework, with sample SPAs using the MVC/AngularJS/WebApi/VITA technology stack.

Background

VITA is an open source ORM and .net application framework developed by Roman Ivantsov.  Some of you may know Roman as the developer of the powerful Irony parser and .net language implementation kit.  I’ve used Irony to great effect on a couple of my projects.

You could say that I am, or was, an Entity Framework guy.  I have used Entity Framework on a fair number of projects, mostly not by choice.  Though I have been able to get the job done with EF, I have long had frustrations using it for more performant and scalable applications.  I have been on the hunt for a more usable framework, having dabbled with ORMs such as NHibernate and micro ORMs such as Dapper.

Late last year I caught wind of VITA and found that it had the most potential for my needs, even motivating me enough to write this article!

Why VITA?

Why would you consider VITA, read further into this article, and dig into it?  There are plenty of other ORMs out there, after all.  For me, two reasons piqued my interest as I dug deeper:

  • ORM – As a whole, I believe VITA stacks up well against the major full-featured ORMs such as Entity Framework and NHibernate.  In this article, I will only cover VITA's basic features.  I won't compare VITA against other ORMs, though I may do so in a follow-up article.
  • Building Blocks – You need more than an ORM to create real-world data-connected .net applications, and VITA provides a number of building blocks that you can easily choose from to add features that are typically needed in such applications.  I will cover the basic features of these building blocks in this article.

For me, the combination of an effective ORM and very useful building blocks (and how this was thought out and designed from the very beginning) makes VITA a unique offering.

Introduction to VITA

I am going to briefly outline key VITA features that will be covered or mentioned in this article.  Please see the documentation and examples at the VITA github site for more comprehensive information on VITA’s features and capabilities.

Full-featured ORM

VITA at its core is a full-featured ORM (as opposed to a lightweight ORM like Dapper), and includes many if not all of the features you would expect from an ORM.  Some key features include:

  • Data model management – For defining your data model and managing the corresponding database schemas:
    • Entities (data mapped objects) – You can easily define your data model in code with .net interfaces, with no visual designer or complicated mapping.  Your schema is created automatically in a code-first approach (a db-first approach is supported as well).
    • Entity and property attributes – You can utilize a wide variety of entity- or property-level attributes to tailor your data model and associated schema (tables, columns, relationships, indexes, etc.) exactly as you need it.
    • Automatic schema updates – Schema updates are managed automatically for you (preserving data) in a code-first approach, based on your current data model (entity interfaces).  See the Continuing On with VITA section below for utilizing existing database information in a db-first approach.
    • Database types – You can choose to implement your data model with any of a number of supported database types, including SQL Server, MySQL, PostgreSQL, and SQLite.
    • Computed properties and views – In addition to the entities and properties that define your schema, you can define any number of computed properties and views to transform your data as needed for application use.
  • Data management and access – For providing the ability to manage and access your data in many conceivable ways:
    • Stored procedures – Stored procedures for CRUD operations are automatically generated and updated for you.  You can use plain SQL statements (automatically generated too) if you prefer, or if the database does not support stored procedures (SQL CE, SQLite).
    • Automatic and lazy loading of related entities and related lists – Full support for one-to-many and many-to-many relations.
    • LINQ – Full LINQ support to query your data.  Translated LINQ queries are cached for efficiency.
    • Self-tracking data – Creating and updating your data is a snap, as your data is self-registered for updates that are organized into transactions.  New/existing keys and original/modified properties are automatically maintained for you during the editing process.
    • Data caching – Data caching is supported directly, including a full-table cache with automatic LINQ query execution against in-memory tables.
  • Packaged components – All of your data management functionality is neatly packaged into an entity module that can be separately tested and used in several applications.

Web Api support

In addition to being a full-featured ORM, VITA is fully integrated with WebApi, allowing you to easily create effective RESTful data services.  Some key features include:

  • Web call context – Easy access to web-related parameters and response information.
  • Sessions – Full support for public and secure web sessions.
  • Client faults – Extensions to easily provide HTTP-friendly response codes and messaging based on validation or other issues.
  • Base api controller – An easy-to-use building block for you to build VITA-enhanced RESTful data services.  This building block provides the features listed above and integration with other services such as error logging.

Slim Api

VITA now provides an easier and more streamlined way to provide RESTful web services for your applications.  You can easily add slim api controllers in your core library (with no ASP.NET WebApi package dependencies) and incorporate business logic in a central place.  Then, it is a snap to utilize slim api services in your UI applications.
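As a rough sketch, a slim api controller might look like the following.  Note: the SlimApiController base class, the ApiGet/ApiRoute attribute names, and the Context helpers here are my assumptions based on reading the VITA sources and may differ in your version; RoomDto and ToDto are from the later examples in this article.

```csharp
// Hypothetical sketch of a slim api controller living in the core library,
// with no ASP.NET WebApi package dependency.  Verify the base class and
// attribute names against the VITA version you are using.
public class RoomsSlimController : SlimApiController
{
    [ApiGet, ApiRoute("rooms/{id}")]
    public RoomDto GetRoom(Guid id)
    {
        var session = Context.OpenSession();
        var room = session.GetEntity<IRoom>(id);
        Context.ThrowIfNull(room, ClientFaultCodes.ObjectNotFound, "Room",
            "Room with ID '{0}' not found.", id);
        return room.ToDto(true);
    }
}
```

Because the controller lives in the core library, the same business logic can be exposed from any host UI application.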

Standard Entity Modules

In addition to building applications with your core data model, you can add to your applications any number of built-in modules that automatically provide some powerful capabilities.  These include:

  • Logging modules – This module provides a seamless way to add automatic logging facilities to your applications.  You can choose any number of specific types of logging, such as errors, SQL operations and transactions, web api calls, and general incidents.  It includes the ability to automatically add tracking columns for transaction logging.  Persistent sessions are now included in the logging module.
  • Login – This module provides advanced signup and login functionality.  You can easily integrate login with your own core user or login table for enhanced registration, tracking, etc.

Authorization Framework

This powerful framework provides the means to define and implement all of the critical authorization rules that determine who can access what data.  Authorization rules are easily configured with entity resources, data filters, permissions, activities, and user roles.  Rules can be configured utilizing a wide variety of access types and can even be defined to the property level.

Learning VITA by example

I will walk you through various features of VITA with working examples that you can download and play around with.  Each of the examples is an MVC/AngularJS SPA that makes WebApi calls, which in turn utilize the VITA framework.  The examples use MS SQL databases, though you could easily switch to another supported database type as you wish.

Below is a VITA basics section followed by two digging deeper sections.  Each section has a separate example solution and download.  This is organized so that you can read a section and choose to play around with VITA before coming back to the next section to learn more.

The code presented in this article will likely be a condensed version of the code found in the example downloads, so bring up the example applications to have a more complete view of what is going on.

VITA basics

To understand the VITA basics, the goal in this section is to use the minimum of core VITA features to get a very simple set of tests and a very basic web application up and running.  Follow along with the Basics Example Solution download, which has the following projects:

  • Basic.VITA – The core VITA managed data model and related packaging.
  • Basic.Tests – Some basic CRUD tests for exercising your VITA managed data model.  Create a test MS SQL database matching the DbConnectionString config value.
  • Basic.DS – A library of web api services and related materials.
  • Basic.UI – An MVC/AngularJS single page web application utilizing the web api services.  Create an MS SQL database matching the DbConnectionString config value.

Basic example

Our very basic example just consists of buildings and rooms, and we want the ability to manage basic building and room data.  Our desired database schema looks like the following:

Data model

Let’s create a data model and package it up for application use (see Basic.VITA project for details).  Defining a data model to produce the above schema is very simple.  We need to:

  • Define an entity interface for each table.  The interface must have the Entity attribute to be a valid data model interface.  We will create an interface for Building and Room.
  • Define basic table columns as entity interface properties.  We will add all of the non-foreign key properties above.
  • Define a primary key for each entity.  We will have Guid primary keys, and will add the PrimaryKey and Auto attributes to that property for auto-generated Guid keys.
  • Define the relationship on both ends: a (get-only) list of rooms for a building, and a reference to a building for a room.

Below are the interfaces for this data model:

[Entity]
public interface IRoom
{
    [PrimaryKey, Auto]
    Guid Id { get; set; }

    int Number { get; set; }

    IBuilding Building { get; set; }
}
[Entity]
public interface IBuilding
{
    [PrimaryKey, Auto]
    Guid Id { get; set; }

    string Name { get; set; }

    IList<IRoom> BuildingRooms { get; }
}
 

That’s it!  VITA has default rules for mapping the types and names in your interfaces to the schema.  You can easily override any of these default rules, which we will cover in a later section.
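For a taste of what overriding the defaults looks like, the sketch below tailors IBuilding with a few of VITA's standard attributes (Size, Nullable, Unique).  The Description property is a hypothetical addition for illustration only; check the VITA documentation for the full attribute list and exact semantics.

```csharp
[Entity]
public interface IBuilding
{
    [PrimaryKey, Auto]
    Guid Id { get; set; }

    [Size(50), Unique]      // cap the column size and enforce unique names
    string Name { get; set; }

    [Nullable, Size(200)]   // hypothetical optional column for illustration
    string Description { get; set; }

    IList<IRoom> BuildingRooms { get; }
}
```

The schema produced from these attributes (column sizes, nullability, unique index) is created and updated automatically, just as with the plain interfaces above.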

In my opinion, ORMs that have you define your data model with concrete classes are behind the times.  In this day and age, we need to build real world applications with scalable and loosely coupled architectures.  We need to utilize our data as part of mockable and dependency injectable services, factories, and other libraries.  This is so much easier to do if our core data model is described in terms of interfaces!  With interfaces, you don’t get exposed to the complexities of the ORM implementation, and you don’t start packing non-data related details into your data model.

We will start accessing data for our little data model shortly, and demonstrate how the BuildingRooms property will be automatically populated for you based on the Building reference in IRoom.  But first, we need to do a little bit of configuration work.

Entity module and application

OK, we have a data model, now what do we do with it?  VITA provides an effective means for packaging up your data related functionality for easy use in your applications.

To begin, you create one or more EntityModules.  An EntityModule is essentially a self-contained group of related entities.  Setting up an entity module essentially amounts to registering your group of related entities to an EntityArea, which we have done below by registering our two entities:

public class DomainModule: EntityModule
{
    public DomainModule(EntityArea area) : base(area, "DomainModule")
    {
        area.RegisterEntities( typeof(IBuilding)
            , typeof(IRoom));
    }
}
 

So, what is an EntityArea?  An EntityArea is essentially a group of entities (one or more modules) registered to a database schema.  We will set one up in a second.

To effectively use all of our data related functionality, we package everything up into an EntityApp.  In our DomainApp, we add one area (with area name and schema name), and set up our module with that area.  You can certainly define and use multiple areas and modules in your entity application as you see fit.

public class DomainApp: EntityApp
{
    public DomainApp()
    {
        var domainArea = this.AddArea("Domain");
        var mainModule = new DomainModule(domainArea);
    }
}
 

In our example, we wrapped all of this up into a class library (see the Basic.VITA project for details), so that we can easily configure and use applications with VITA managed entities.

Accessing and managing your data

Now we want to use our handy little DomainApp component, create our schema, and do some actual data operations!  Let’s look over how to set up some tests to perform these operations (see the Basic.Tests project for details).

In order to run some tests, we need to set up our DomainApp for actual use.  Setting up an EntityApp for use is just 3 simple steps: create, initialize, and connect (to your database).  For the tests, I decided to set up a base test class that sets up the DomainApp at the beginning of the test run (of course you can choose to do this for every test):

[TestClass]
public abstract class BaseUnitTest
{
    [AssemblyInitialize]
    public static void Initialize(TestContext testContext)
    {
        // set up application
        var protectedSection = (NameValueCollection)ConfigurationManager.GetSection("protected");
        DomainApp = new DomainApp();
        DomainApp.Init();
        var connString = protectedSection["MsSqlConnectionString"];
        var driver = MsSqlDbDriver.Create(connString);
        var dbOptions = MsSqlDbDriver.DefaultMsSqlDbOptions;
        var dbSettings = new DbSettings(driver, dbOptions, connString, modelUpdateMode: DbModelUpdateMode.Always);
        DomainApp.ConnectTo(dbSettings);
    }

    protected static DomainApp DomainApp { get; set; }
}


When an EntityApp is initialized and connected, your database (schema, stored procedures, etc.) will be created/updated.  Verify that these items have been created in your database.

Now examine the following test, which performs some CRUD operations with IRoom instances:

[TestMethod]
public void RoomCRUDTest()
{
    IEntitySession session1 = DomainApp.OpenSession();
    IEntitySession session2;

    // create Room
    IRoom room1 = session1.NewEntity<IRoom>();
    room1.Number = 221;
    room1.Building = session1.NewEntity<IBuilding>();
    room1.Building.Name = "Building 1";

    session1.SaveChanges();
    Assert.IsNotNull(room1, "Create and save of IRoom item failed.");

    // read Room
    session2 = DomainApp.OpenSession();
    IRoom room2 = session2.GetEntity<IRoom>(room1.Id);
    Assert.IsNotNull(room2, "Retrieval of new IRoom item failed.");
    Assert.IsTrue(RoomTest.CompareItems(room1, room2), "Retrieved IRoom item match with created item failed.");

    // search Room
    session2 = DomainApp.OpenSession();
    room2 = (from i in session2.EntitySet<IRoom>()
             where i.Number == room1.Number
             select i).FirstOrDefault();
    Assert.IsNotNull(room2, "Search of new IRoom item failed.");

    // update Room
    room1.Number = 222;
    room1.Building.Name = "Building 1a";
    session1.SaveChanges();
    session2 = DomainApp.OpenSession();
    room2 = session2.GetEntity<IRoom>(room1.Id);
    Assert.IsNotNull(room2, "Retrieval of updated IRoom item failed.");
    Assert.IsTrue(RoomTest.CompareItems(room1, room2), "Retrieved IRoom item match with updated item failed.");

    // delete Room
    session1.DeleteEntity<IBuilding>(room1.Building);
    session1.DeleteEntity<IRoom>(room1);
    session1.SaveChanges();
    session2 = DomainApp.OpenSession();
    room2 = session2.GetEntity<IRoom>(room1.Id);
    Assert.IsNull(room2, "Delete of IRoom item failed.");
}
 

We start off by opening a session for our DomainApp that we will use to perform CRUD operations.  We use a second session in the test to verify update operations.

Creating an item

See the “create Room” portion of the test above.  We use the NewEntity call to create an IRoom that will be tracked as a created (new) item:

IRoom room1 = session1.NewEntity<IRoom>();

You can populate your IRoom as you wish, and then save changes for your session.  If you are in the debugger, you will notice that before the save, the IRoom is in the “New” state and has an automatically generated Guid, and after the save the IRoom is in the “Loaded” state.

You will notice that an IBuilding was also created:

room1.Building = session1.NewEntity<IBuilding>();

All we had to do was set the reference to the new IBuilding.  Changes are tracked for us, and details such as foreign key values are handled for us.  Easy!

If you run into issues with the edited data, reverting changes in your session is easy; you just need to call CancelChanges().
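As a tiny sketch (reusing the DomainApp from our tests), an unwanted edit can be abandoned like this:

```csharp
// Sketch: discard pending edits instead of saving them.
var session = DomainApp.OpenSession();
var room = session.NewEntity<IRoom>();
room.Number = 999;        // room is in the "New" state, not yet saved

session.CancelChanges();  // all pending changes in this session are discarded
// Nothing was written to the database; open a new session to continue.
```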

Retrieving an item

See the “read Room” portion of the test above.  Getting an item by primary key is easy with the GetEntity call.  In the debugger, you will notice that the retrieved IRoom is in the “Loaded” state.

Searching for an item

See the “search Room” portion of the test above.  This is just a simple LINQ query to retrieve an IRoom by room number.  With full LINQ support, you can build up complex queries, with joins and groupings, paging, etc.
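As a sketch of a richer query against the same model (again assuming the DomainApp from our tests), filtering, ordering, and paging compose naturally:

```csharp
// Sketch: a filtered, ordered, and paged LINQ query over the entity sets.
var session = DomainApp.OpenSession();
var rooms = (from r in session.EntitySet<IRoom>()
             where r.Building.Name == "Building 1" && r.Number > 100
             orderby r.Number descending
             select r)
            .Skip(10)   // paging: skip the first page
            .Take(10)   // page size
            .ToList();
```

The translated query is cached by VITA, so repeated executions avoid the LINQ translation cost.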

Updating an item

See the “update Room” portion of the test above.  Since your entities are self-tracking, all we have to do is update whatever properties we want (including references).  When you are done, save changes for your session.

In the debugger, you will notice that the updated IRoom is in the “Modified” state before saving changes and in the “Loaded” state after the save.

Deleting an item

See the “delete Room” portion of the test above.  Marking items for deletion is simple using the DeleteEntity call:

session1.DeleteEntity<IBuilding>(room1.Building);
session1.DeleteEntity<IRoom>(room1);

As before, save changes when you are done.  In the debugger, you will notice that the deleted IRoom (and associated IBuilding) is in the “Deleting” state before saving changes and in the “Fantom” state after the save.

Handling changes to your database

Take a look at the generated schema in your database if you haven’t already done so.  Now we want to make some changes.

Go ahead and add some properties to your data model.  In my case, I added a Name and a Capacity to IRoom:

[Entity]
public interface IRoom
{
    [PrimaryKey, Auto]
    Guid Id { get; set; }

    int Number { get; set; }

    string Name { get; set; }

    int Capacity { get; set; }

    IBuilding Building { get; set; }
}
 

I made corresponding additions in the “create Room” test to populate these new properties:

room1.Name = "My Room";
room1.Capacity = 77;

Go ahead and rerun the tests.  During the test run, when the EntityApp is initialized and connected, the schema changes (additions) are applied automatically, and the tests pass.

Go ahead and make a couple of additions and deletions (of your additions) to your entities and rerun the tests.  VITA will automatically apply your changes.  The only issue you might run into is if a change violates a constraint in your existing data.  We will address how to deal with that later.

Go ahead and revert all of your changes before moving forward.

RESTful web services

The VITA framework is fully integrated with the MS WebApi framework, and we want to create some RESTful services that we can use wherever we need to.  Let’s start building some services in a data services project (see the Basic.DS project for details).  Note that we will cover the slim api in the next example.

Api controllers

We want to build an api controller for each entity in our data model to provide the services.  This is a simple process using VITA’s BaseApiController.  We will walk through creating a controller for IRoom:

public class RoomsController : BaseApiController
{
}
 

Examine the following method to search for IRoom items:

[HttpGet, Route("api/rooms")]
public QueryResults<RoomDto> GetRooms([FromUri] RoomQuery query)
{
    var session = OpenSession();
    query = query ?? new RoomQuery();
    if (query.Take == 0) query.Take = 10;

    // Build where clause
    Guid buildingId;
    Guid.TryParse(query.BuildingId, out buildingId);
    var where = session.NewPredicate<IRoom>();
    where = where
        .AndIf(query.Number != 0, i => i.Number == query.Number.Value)
        .AndIf(buildingId != Guid.Empty, i => i.Building.Id == buildingId);

    // Build order by
    var orderByMapping = new Dictionary<string, string>(StringComparer.InvariantCultureIgnoreCase)
    {
        { "id", "Id" },
        { "number", "Number" },
        { "building_name", "Building.Name" },
    };

    var results = new QueryResults<RoomDto>(session.ExecuteSearch(where, query, i => i.ToDto(), null, orderByMapping));
    results.CanCreate = true;

    return results;
}
 

VITA provides a convenient ExecuteSearch method to provide paginated data based on your input criteria.  The first step is to use NewPredicate<IRoom> to build up a where clause with the optional criteria in the RoomQuery.  The predicate building provides a number of extension methods, such as True and AndIfNotEmpty, for building up your where clause.  Next, build up a dictionary of order-by cases, where the dictionary key is the order-by name and the value is the order-by property (these can be deep properties such as Building.Name).  Then, calling ExecuteSearch provides a page of ordered results (the results include the total number of items in the search).  Note that the input query has an order-by property that specifies which property to order by.  The order property can be a comma-delimited list of names in the dictionary (appending -desc to a name specifies a descending sort).

I’m not going to show the details here, but in the example you will find additional support classes for transferring and providing data:

  • Query classes (such as RoomQuery) – These classes merely make it simpler to pass in optional search criteria for search methods, used in the query parameter above.
  • Dto classes (such as RoomDto) – These classes are a concrete representation of data model entities.  The properties essentially match those in the corresponding interfaces (such as IRoom).
  • Dto extensions (such as RoomDtoExtensions) – These extensions are used to make it easier to transfer data from VITA into the dto object equivalents, as seen in the ToDto call above.

Examine the following controller method to get an IRoom item:

[HttpGet, Route("api/rooms/{id}")]
public RoomDto GetRoom(Guid id)
{
    var session = OpenSession();
    var item = session.GetEntity<IRoom>(id);
    if (item == null)
    {
        WebContext.CustomResponseStatus = HttpStatusCode.BadRequest;
        WebContext.ResponseBody = String.Format("Room with ID '{0}' not found.", id);
        return null;
    }
    RoomDto itemDto = item.ToDto(true);
    Type[] blockingEntities;
    itemDto.CanDelete = itemDto.CanDelete && session.CanDeleteEntity<IRoom>(item, out blockingEntities);
    return itemDto;
}
 

Here we use GetEntity<IRoom> to get an IRoom by id.  The built-in WebContext property and extension methods make it easy to provide an HTTP-friendly response with the expected error codes and messaging.

One feature to point out here is the CanDeleteEntity<IRoom> call.  This built-in method checks if there are any foreign key references that would block the successful deletion of this item.  I created the CanDelete property so that the UI can disable delete if it would cause foreign key violations.

Examine the following controller methods to create and update IRoom items:

[HttpPost, Route("api/rooms")]
public RoomDto CreateRoom(RoomDto item)
{
    return CreateUpdateRoom(item, create: true);
}

[HttpPut, Route("api/rooms")]
public RoomDto UpdateRoom(RoomDto item)
{
    return CreateUpdateRoom(item, create: false);
}

private RoomDto CreateUpdateRoom(RoomDto item, bool create)
{
    var session = OpenSession();
    item.Validate(OpContext);
    OpContext.ThrowValidation(); // will throw if any faults have been detected; returns BadRequest with a list of faults in the body
    IRoom updateItem;
    if (create)
    {
        updateItem = session.NewEntity<IRoom>();
    }
    else
    {
        updateItem = session.GetEntity<IRoom>(item.Id);
        OpContext.ThrowIfNull(updateItem, ClientFaultCodes.ObjectNotFound, "Room", "Room with ID '{0}' not found.", item.Id);
    }
    updateItem.Number = item.Number;
    updateItem.Building = session.GetEntity<IBuilding>(item.BuildingId);
    session.SaveChanges();
    return updateItem.ToDto(true);
}
 

As with the test example, we use NewEntity<IRoom> or GetEntity<IRoom> to create or get an IRoom to update.  VITA provides extensions to the built-in operation context OpContext to make it easy to manage and provide a response for client faults.  The dto class Validate() method makes use of some validation extensions (the validation below is for an IBuilding):

public void Validate(OperationContext context)
{
    context.ValidateNotEmpty(Name, "Name", "Name may not be empty.");
    context.ValidateMaxLength(Name, 50, "Name", "Name text is too long.");
}
 

The ThrowValidation() extension will throw if any faults have been detected, and return a BadRequest with the list of faults.  You can also throw specific client faults with extensions such as ThrowIfNull().  Here we throw a fault if an item requested to be updated is not found.

Just as with the test example, call SaveChanges() to save the changes to the self-tracking entities, and CancelChanges() if you need to.

Finally, examine the following controller method to delete an IRoom item:

[HttpDelete, Route("api/rooms/{id}")]
public void DeleteRoom(Guid id)
{
    var session = OpenSession();
    var item = session.GetEntity<IRoom>(id);
    OpContext.ThrowIfNull(item, ClientFaultCodes.ObjectNotFound, "Room", "Room with ID '{0}' not found.", id);
    session.DeleteEntity(item);
    session.SaveChanges();
}


As with the test example, we use DeleteEntity() to delete an IRoom, and SaveChanges() to perform the actual delete.  As with the update case, we throw a client fault if an item requested to be deleted is not found.

Review the overall api controllers and support classes in the example download.

Packaging and configuration

It’s a good idea to package up your web api services into a class library that can be configured and used as you need to.  The core code to configure the services is as follows:

public static class DomainWebApiConfig
{
    public static void Register(HttpConfiguration config)

Notice that the setting up of the EntityApp is virtually identical to the test example.

AngularJS application

This article isn’t about building UI applications, but we need to build a basic one to be able to really show VITA’s features, particularly the WebApi-supporting ones.  I will show you some snippets to give you an idea of the application structure and leave it to you to dig deeper.

The UI application is a single page application for basic administration.  The key elements of this application are:

  • MVC home controller/view – This is an MVC application with a home controller to present the home page or main view.  This didn’t need to be an MVC app, as Angular will be used to do all of the work, but I thought it was a good idea in case you ever need a traditional MVC controller for some functions.  Angular works well with these controllers too.
  • AngularJS module – The overall module to manage the application with angular controllers, services, (template) views, etc.
  • AngularJS controllers, services, and templates – Each entity in our data model has a controller, a service, and a set of templates to provide UI functionality to administer that entity.  The angular services make the web api calls to our VITA data services, which in turn manage the data.

Let’s start going through some UI code (see the Basic.UI project for details).

Utilizing our web api services

To utilize our class library web api services, we just need to register our DomainWebApiConfig during the start of our MvcApplication (global.asax):

public class MvcApplication : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        DomainWebApiConfig.Register(GlobalConfiguration.Configuration);

        RouteConfig.RegisterRoutes(RouteTable.Routes);
        BundleConfig.RegisterBundles(BundleTable.Bundles);
    }
}
 

Angular module

In building Angular SPAs, I personally love the ui-routing and state management features that allow you to easily manage and update multiple views.  So, this is the approach taken with each of the example applications.

Following is a condensed view of the angular module (DomainApp.js) with just the room-related information:

var DomainApp = angular.module('DomainApp', ['ui.router', 'ui.bootstrap', 'angularValidator', 'ngCookies']);

// add controllers, services, and factories
DomainApp.controller('RoomsController', RoomsController);
DomainApp.service('RoomsService', RoomsService);

var configFunction = function ($stateProvider, $httpProvider, $locationProvider) {
   
    $stateProvider
        .state('roomSearch', {
            url: '/rooms?number&buildingId&orderBy&descending&page&pageSize',
            views: {
                "searchView": {
                    templateUrl: '/Templates/rooms/Search.html',
                    controller: RoomsController
                }
            }
        })
        .state('roomResults', {
            url: '/rooms/Results?number&buildingId&orderBy&descending&page&pageSize',
            views: {
                "detailView": {
                    templateUrl: '/Templates/rooms/Results.html',
                    controller: RoomsController
                }
            }
        })
        .state('roomGet', {
            url: '/rooms/get?id',
            views: {"detailView": {
                    templateUrl: '/Templates/rooms/Get.html',
                    controller: RoomsController
                }
            }
        })
        .state('roomCreate', {
            url: '/rooms/create?buildingId',
            views: {
                "detailView": {
                    templateUrl: '/Templates/rooms/Create.html',
                    controller: RoomsController
                }
            }
        })
        .state('roomUpdate', {
            url: '/rooms/update?id',
            views: {
                "detailView": {
                    templateUrl: '/Templates/rooms/Update.html',
                    controller: RoomsController
                }
            }
        })
        .state('roomDelete', {
            url: '/rooms/delete?id',
            views: {
                "detailView": {
                    templateUrl: '/Templates/rooms/Delete.html',
                    controller: RoomsController
                }
            }
        })
        .state('home', {
            url: '/'
        });
}
configFunction.$inject = ['$stateProvider', '$httpProvider', '$locationProvider'];

DomainApp.config(configFunction);


For this ui-routing and state management approach, there are two views in the main page that get updated: searchView and detailView.  For each state, you define the URL and the views that are updated.  For each view, you define the source for the template and the angular controller to be loaded.
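As a rough illustration of how a state's query parameters end up in its URL, here is a simplified sketch of serializing a params object against a state's path.  Note that this is an illustration only, not ui-router's actual serialization logic, and buildStateUrl is a hypothetical helper:

```javascript
// Simplified sketch of how state parameters end up in the query string.
// NOTE: illustration only -- not ui-router's real implementation.
function buildStateUrl(path, params) {
  var search = new URLSearchParams();
  Object.keys(params).forEach(function (key) {
    var value = params[key];
    // omit empty values, much as ui-router omits unset optional params
    if (value !== undefined && value !== null && value !== '') {
      search.append(key, value);
    }
  });
  var query = search.toString();
  return query ? path + '?' + query : path;
}
```

For example, `buildStateUrl('/rooms/Results', { number: 101, page: 2, pageSize: 10 })` produces `/rooms/Results?number=101&page=2&pageSize=10`, matching the shape of the roomResults state above.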

Angular controller

Examine the following angular controller, which enables the views for performing CRUD operations on IRoom items.  Calls are made to the corresponding service functions to perform the operations (the code below is condensed to show search only; the other functions are similar):

var RoomsController = function($scope, $stateParams, $state, $window, $location, RoomsService, BuildingsService) {
    // data for search operations
    $scope.searchQuery = {
        number: Number($stateParams.number) || 0,
        buildingId: $stateParams.buildingId || "00000000-0000-0000-0000-000000000000",
        orderBy: $stateParams.orderBy || '',
        descending: $stateParams.descending || 'false',
        page: $stateParams.page || 1,
        pageSize: $stateParams.pageSize || 10,
        totalPages: 0,
        filter: 'none'
    };
    
    // data to get search results
    $scope.searchResults = {
        items: null,
        totalPages: 0,
        totalItems: 0,
        hasResults: false,
        canCreate: true
    };
    
    // data to get an item
    $scope.itemQuery = {
        id: $stateParams.id || "00000000-0000-0000-0000-000000000000",
        itemFound: false
    };
    
    // data for create, update, and delete operations
    $scope.itemForm = {
        number: 0,
        buildingId: $stateParams.buildingId || "00000000-0000-0000-0000-000000000000",
        buildings: null,
        canEdit: false,
        canDelete: false
    };
    
    // status on any operation
    $scope.status = {
        isReadOnly: false,
        isError: false,
        errorMessage: '',
        isSuccess: false,
        successMessage: ''
    };
    
    $scope.navbarProperties = {
        isCollapsed: true
    };
    
    // search api
    $scope.search = function () {
        $scope.searchQuery.filter = '';
        if ($scope.searchQuery.number != 0) {
            if ($scope.searchQuery.filter != '') {
                $scope.searchQuery.filter = $scope.searchQuery.filter + ', ';
            }
            $scope.searchQuery.filter = $scope.searchQuery.filter + 'Number: ' + $scope.searchQuery.number;
        }
        if ($scope.searchQuery.buildingId != "00000000-0000-0000-0000-000000000000") {
            if ($scope.searchQuery.filter != '') {
                $scope.searchQuery.filter = $scope.searchQuery.filter + ', ';
            }
            $scope.searchQuery.filter = $scope.searchQuery.filter + 'Building Id: ' + $scope.searchQuery.buildingId;
        }
        if ($scope.searchQuery.filter == '') {
            $scope.searchQuery.filter = 'none';
        }
        var orderBy = $scope.searchQuery.orderBy;
        if ($scope.searchQuery.descending == 'true') {
            orderBy = orderBy + '-desc';
        }
        var result = RoomsService.searchRooms($scope.searchQuery.number, $scope.searchQuery.buildingId, orderBy, $scope.searchQuery.page, $scope.searchQuery.pageSize);
        result.then(function(result) {
            if (result.isSuccess) {
                $scope.searchResults.items = result.items;
                $scope.searchResults.totalPages = Math.ceil(1.0 * result.totalItems / $scope.searchQuery.pageSize);
                $scope.searchResults.totalItems = result.totalItems;
                $scope.searchResults.hasResults = true;
                $scope.searchResults.canCreate = result.canCreate;
            } else {
                $scope.status.isError = true;
                $scope.status.isSuccess = false;
                $scope.status.errorMessage = result.message;
            }
        });
    }
    
    $scope.refreshSearch = function () {
        $state.go('roomResults', {
            'number': $scope.searchQuery.number,
            'buildingId': $scope.searchQuery.buildingId,
            'orderBy': $scope.searchQuery.orderBy,
            'descending': $scope.searchQuery.descending,
            'page': $scope.searchQuery.page,
            'pageSize': $scope.searchQuery.pageSize
        });
    }
    
    // get api
    $scope.get = function (isEdit) {
        var result = RoomsService.getRoom($scope.itemQuery.id);
        result.then(function(result) {
            if (result.isSuccess) {
                $scope.status.isSuccess = true;
                $scope.itemForm.id = result.data.Id;
                $scope.itemForm.number = result.data.Number;
                $scope.itemForm.buildingId = result.data.BuildingId;
                $scope.itemForm.canEdit = result.data.CanEdit;
                $scope.itemForm.canDelete = result.data.CanDelete;
                if (isEdit == true && $scope.itemForm.canEdit == false) {
                    $scope.status.isReadOnly = true;
                }
                $scope.init();
            } else {
                $scope.status.isError = true;
                $scope.status.isSuccess = false;
                $scope.status.errorMessage = result.message;
            }
        });
    }
    
    // create api
    $scope.create = function () {
        var result = RoomsService.createRoom($scope.itemForm.number, $scope.itemForm.buildingId);
        result.then(function(result) {
            if (result.isSuccess) {
                $scope.status.isSuccess = true;
                $scope.status.isReadOnly = true;
                $scope.status.isError = false;
                $scope.status.successMessage = "Room item successfully created."
            } else {
                $scope.status.isError = true;
                $scope.status.isSuccess = false;
                $scope.status.errorMessage = result.message;
            }
        });
    }
    
    // update api
    $scope.update = function () {
        var result = RoomsService.updateRoom($scope.itemForm.id, $scope.itemForm.number, $scope.itemForm.buildingId);
        result.then(function(result) {
            if (result.isSuccess) {
                $scope.status.isSuccess = true;
                $scope.status.isReadOnly = true;
                $scope.status.isError = false;
                $scope.status.successMessage = "Room item successfully updated."
            } else {
                $scope.status.isError = true;
                $scope.status.isSuccess = false;
                $scope.status.errorMessage = result.message;
            }
        });
    }
    
    // delete api
    $scope.delete = function () {
        var result = RoomsService.deleteRoom($scope.itemQuery.id);
        result.then(function(result) {
            if (result.isSuccess) {
                $scope.status.isSuccess = true;
                $scope.status.isReadOnly = true;
                $scope.status.isError = false;
                $scope.status.successMessage = "Room item successfully deleted."
            } else {
                $scope.status.isError = true;
                $scope.status.isSuccess = false;
                $scope.status.errorMessage = result.message;
            }
        });
    }
}

RoomsController.$inject = ['$scope', '$stateParams', '$state', '$window', '$location', 'RoomsService', 'BuildingsService'];
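The paging math used by the search flow above -- the client sends Skip/Take to the server and derives the page count from the returned total -- can be isolated as two small helpers.  These are a sketch for clarity, not part of the sample's source:

```javascript
// Hypothetical helpers isolating the paging math used by the search flow.
function pageToSkipTake(page, pageSize) {
  var skip = (page - 1) * pageSize;
  // clamp negative skip to 0, as the service code does
  return { skip: skip < 0 ? 0 : skip, take: pageSize };
}

function totalPages(totalItems, pageSize) {
  // e.g. 25 items at 10 per page -> 3 pages
  return Math.ceil(totalItems / pageSize);
}
```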


Angular service

Examine the following angular service, which has functions that perform the CRUD operations on IRoom items by calling the VITA-enhanced web api services.  Success and error response information for each function is then returned (note that you have easy access to the response data, status, headers, and config info):

var RoomsService = function ($http, $q) {
    this.searchRooms = function (number, buildingId, orderBy, page, pageSize) {
        var deferredObject = $q.defer();
        var results = {
            isSuccess: true,
            message: '',
            items: null,
            totalItems: 0,
            canCreate: true
        }
        var searchQuery = {
            Number: number,
            BuildingId: buildingId,
            OrderBy: orderBy,
            Skip: (page - 1) * pageSize,
            Take: pageSize
        };
        if (searchQuery.Skip < 0) searchQuery.Skip = 0;
        
        $http.get('/api/rooms', { params: searchQuery }).
            success(function (data) {
                results.items = data.Results;
                results.totalItems = data.TotalCount;
                results.canCreate = data.CanCreate;
                deferredObject.resolve(results);
            }).
            error(function (data, status, headers, config) {
                results.isSuccess = false;
                results.message = 'Could not search for Room items: ';
                if (typeof data == "string") {
                    results.message = results.message + ' ' + data;
                } else {
                    for (var i = 0; i < data.length; i++) {
                        results.message = results.message + ' ' + data[i].Message;
                    }
                }
                deferredObject.resolve(results);
            });
        
        return deferredObject.promise;
    };
    
    this.getRoom = function (id) {
        var deferredObject = $q.defer();
        var results = {
            isSuccess: true,
            message: '',
            data: null
        }
        
        $http.get('/api/rooms/' + id).
            success(function (data) {
                results.data = data;
                deferredObject.resolve(results);
            }).
            error(function (data, status, headers, config) {
                results.isSuccess = false;
                results.message = 'Could not get Room item:';
                if (typeof data == "string") {
                    results.message = results.message + ' ' + data;
                } else {
                    for (var i = 0; i < data.length; i++) {
                        results.message = results.message + ' ' + data[i].Message;
                    }
                }
                deferredObject.resolve(results);
            });
        
        return deferredObject.promise;
    };
    
    this.listRoom = function (id) {
        var deferredObject = $q.defer();
        var results = {
            isSuccess: true,
            message: '',
            data: null
        }
        
        $http.get('/api/roomslist', { params: { take: 100, id: id } }).
            success(function (data) {
                results.data = data;
                deferredObject.resolve(results);
            }).
            error(function (data, status, headers, config) {
                results.isSuccess = false;
                results.message = 'Could not get Room list:';
                if (typeof data == "string") {
                    results.message = results.message + ' ' + data;
                } else {
                    for (var i = 0; i < data.length; i++) {
                        results.message = results.message + ' ' + data[i].Message;
                    }
                }
                deferredObject.resolve(results);
            });
        
        return deferredObject.promise;
    };
    
    this.createRoom = function (number, buildingId) {
        var deferredObject = $q.defer();
        var results = {
            isSuccess: true,
            message: '',
            data: null
        }
        var itemData = {
            Number: number, 
            BuildingId: buildingId
        };
        
        $http.post('/api/rooms', itemData).
            success(function (data) {
                results.data = data;
                deferredObject.resolve(results);
            }).
            error(function (data, status, headers, config) {
                results.isSuccess = false;
                results.message = 'Could not create Room item:';
                if (typeof data == "string") {
                    results.message = results.message + ' ' + data;
                } else {
                    for (var i = 0; i < data.length; i++) {
                        results.message = results.message + ' ' + data[i].Message;
                    }
                }
                deferredObject.resolve(results);
            });
        
        return deferredObject.promise;
    };
    
    this.updateRoom = function (id, number, buildingId) {
        var deferredObject = $q.defer();
        var results = {
            isSuccess: true,
            message: '',
            data: null
        }
        var itemData = {
            Id: id, 
            Number: number, 
            BuildingId: buildingId
        };
        
        $http.put('/api/rooms', itemData).
            success(function (data) {
                results.data = data;
                deferredObject.resolve(results);
            }).
            error(function (data, status, headers, config) {
                results.isSuccess = false;
                results.message = 'Could not update Room item:';
                if (typeof data == "string") {
                    results.message = results.message + ' ' + data;
                } else {
                    for (var i = 0; i < data.length; i++) {
                        results.message = results.message + ' ' + data[i].Message;
                    }
                }
                deferredObject.resolve(results);
            });
        
        return deferredObject.promise;
    };
    
    this.deleteRoom = function (id) {
        var deferredObject = $q.defer();
        var results = {
            isSuccess: true,
            message: '',
            data: null
        }
        
        $http.delete('/api/rooms/' + id).
            success(function (data) {
                results.data = data;
                deferredObject.resolve(results);
            }).
            error(function (data, status, headers, config) {
                results.isSuccess = false;
                results.message = 'Could not delete Room item:';
                if (typeof data == "string") {
                    results.message = results.message + ' ' + data;
                } else {
                    for (var i = 0; i < data.length; i++) {
                        results.message = results.message + ' ' + data[i].Message;
                    }
                }
                deferredObject.resolve(results);
            });
        
        return deferredObject.promise;
    };
}

RoomsService.$inject = ['$http', '$q'];
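Each error callback in the service above repeats the same message-building logic: the response body is either a plain string or an array of fault objects carrying a Message property.  That pattern could be factored into a helper like this hypothetical function (not part of the sample's source):

```javascript
// Hypothetical helper factoring out the repeated error-message building.
// The api returns either a plain string or an array of { Message } objects.
function buildErrorMessage(prefix, data) {
  var message = prefix;
  if (typeof data === 'string') {
    message += ' ' + data;
  } else if (Array.isArray(data)) {
    data.forEach(function (fault) {
      message += ' ' + fault.Message;
    });
  }
  return message;
}
```

Each error callback then reduces to one line, e.g. `results.message = buildErrorMessage('Could not get Room item:', data);`.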
 

Admin tool UI

Following is a screen shot of the admin tool.  Go ahead and dig deeper, examining and running the application to manage your data.

Digging deeper with VITA modules

In the VITA basics section, we went through many of VITA’s core features.  In this example, we are going to go a little deeper with a larger data model and utilize one of VITA’s additional modules: Logging.  We will also illustrate some computed properties and views, and utilize Slim Api.  Follow along with the Northwind Example Solution download, which has the following projects:

  • Northwind.VITA – The VITA managed data model with additional modules, and related packaging.  This assembly also includes the Slim Api controllers.
  • Northwind.Tests – Some basic CRUD tests for exercising your VITA managed data model.  Create a test MS SQL database matching the DbConnectionString config value.
  • Northwind.DS – A library of web api services and related materials.
  • Northwind.UI – An MVC/AngularJS single page web application utilizing the web api services and additional modules.  Create an MS SQL database matching the DbConnectionString config value.

Northwind example

Northwind is a common sample database, and we want to use that familiar structure as our data model for this example.  Our desired database schema looks like the following:

Data model with additional attributes

In building real world applications, we need to be able to tailor our data model to meet exacting requirements for a variety of situations.  VITA does not disappoint in that regard.  Review the example download to view the complete data model for this Northwind case.  I will just show some excerpts to illustrate some features that enable you to tailor your data model.

Data model example

View the example data model interface for ICategory:

[Entity(Name="Category", TableName="Categories")]
[Paged, OrderBy("CategoryName")]
public partial interface ICategory
{
    [Column("CategoryID"), PrimaryKey, ClusteredIndex(IndexName="PK_Categories"), Identity]
    int CategoryID { get; set; }

    [Column("CategoryName", Size = 15), Index(IndexName="CategoryName")]
    string CategoryName { get; set; }

    [Column("Description"), Nullable, Unlimited]
    string Description { get; set; }

    [OrderBy("ProductName")]
    IList<IProduct> Products { get; }
}
 

Entity level attributes

VITA provides a number of optional attributes to tailor how your entities/tables are managed.  Some of these attributes include:

  • Entity – This is a required attribute, but you can optionally specify the name of the table and/or the name of the entity type.
  • Paged – For larger tables, use this attribute to trigger VITA to generate stored procedures with paging.
  • OrderBy – Use this attribute to trigger VITA to generate default ordering in listing stored procedures.

Property level attributes

VITA provides a number of optional attributes to tailor how your columns/properties are managed.  Some of these attributes include:

  • Column – You can tailor the name, Size and specific db data type (DbType or DbTypeSpec) for your property.
  • ClusteredIndex – Specifies that the property is the clustered index; you can optionally specify the name of the index.  For smaller tables, use the entity level HeapTable attribute instead, which will also affect how VITA caches data for the entity.  Use the Index attribute for non-clustered indexes.
  • Identity – For specifying an identity primary key property.  Our basic example used the Auto attribute for automatically generated Guid primary keys.
  • Nullable – For specifying that property is nullable.
  • Unlimited – For specifying no limit to data length.
  • OrderBy – For collections, you can specify a default ordering.

Data model example

View the example data model interface for IOrderDetail:

[Entity(Name="OrderDetail", TableName="Order Details")]
[PrimaryKey("Order,Product")]
[ClusteredIndex("Order,Product", IndexName="PK_Order_Details")]
[Paged]
public partial interface IOrderDetail
{
    [Column("UnitPrice", DbTypeSpec = "money", Scale = 4, Precision = 19)]
    decimal UnitPrice { get; set; }

    [Column("Quantity", DbType = DbType.Int16)]
    short Quantity { get; set; }

    [Column("Discount", DbTypeSpec = "real", Scale = 0, Precision = 24)]
    float Discount { get; set; }

    [EntityRef(KeyColumns = "OrderID")]
    IOrder Order { get; set; }

    [EntityRef(KeyColumns = "ProductID")]
    IProduct Product { get; set; }
}
 

Entity level attributes

Some additional attributes illustrated here include:

  • PrimaryKey – For composite primary keys, you can define the properties in the key at the entity level.
  • ClusteredIndex – For composite clustered indexes, you can define the properties in the index at the entity level.  Use the Index attribute for non-clustered indexes.

Property level attributes

Some additional attributes illustrated here include:

  • EntityRef – To explicitly specify foreign key column name(s).  For example, for the Order property the default would be Order_id.

Data model example

View the (consolidated) example data model interface for IEmployee:

[Entity]
public partial interface IEmployee
{
    [PrimaryKey, Identity]
    int EmployeeID { get; set; }

    string LastName { get; set; }

    string FirstName { get; set; }

    [ManyToMany(typeof(IEmployeeTerritory))]
    IList<ITerritory> Territories { get; }

    [EntityRef(KeyColumns = "ReportsTo"), Nullable]
    IEmployee Employee { get; set; }

    [Computed(typeof(EmployeeHelper), "GetEmployeeFullName"), DependsOn("LastName,FirstName")]
    string FullName { get; }
}
 

Many to many relationships

Some database tables formalize many-to-many relationships, and when these tables contain no useful information beyond mapping the related tables, it would be nice if we didn't have to deal with these “mapping” tables directly.

VITA provides direct support for these many-to-many relationships, as illustrated by the ManyToMany attribute on the Territories property, where IEmployeeTerritory is a mapping entity/table.  Now we can just think of what territories the employee belongs to, and VITA will automatically update the mapping table when you use Add or Remove on your (Territories) list.  See the EmployeeCRUDTest in the example download to see this in action.

Computed Properties and Views

Computed properties and views allow you to transform your data in many ways as needed for your applications.

Computed properties

Notice the read only computed property FullName in the IEmployee data model item above.  Setting up computed properties is a snap: specify the type and method that will perform the computation, and the properties the computation depends on.  Below is the EmployeeHelper.GetEmployeeFullName() method that performs the computation for the FullName property.

public static class EmployeeHelper
{
    public static string GetEmployeeFullName(IEmployee employee)
    {
        return employee.FirstName + " " + employee.LastName;
    }
}
 

Let's make use of this property in a test; see a portion of the EmployeeCRUDTest() test method below.  Note that we have access to employee2.FullName and get the expected first and last name.

[TestMethod]
public void EmployeeCRUDTest()
{
    IEntitySession session1 = DomainApp.OpenSession();
    IEntitySession session2;

    // create Employee
    IEmployee employee1 = EmployeeTest.GetTestEmployee(session1);
    employee1.FirstName = "John";
    employee1.LastName = "Doe";
    session1.SaveChanges();
    Assert.IsNotNull(employee1, "Create and save of IEmployee item failed.");

    // read Employee
    session2 = DomainApp.OpenSession();
    IEmployee employee2 = session2.GetEntity<IEmployee>(employee1.EmployeeID);
    Assert.IsNotNull(employee2, "Retrieval of new IEmployee item failed.");
    Assert.IsTrue(EmployeeTest.CompareItems(employee1, employee2), "Retrieved IEmployee item match with created item failed.");
    Assert.AreEqual(employee2.FullName, "John Doe");
}
 

Views

Setting up views takes a little more work, but is still easy.  Views are set up as part of your EntityModule as below:

public class DomainModule: EntityModule
 {
     public DomainModule(EntityArea area) : base(area, "DomainModule")
     {
         RegisterEntities( typeof(ICustomer)
             , typeof(IProduct)
             , typeof(ICategory)
             , typeof(ICustomerCustomerDemo)
             , typeof(ICustomerDemographic)
             , typeof(IEmployee)
             , typeof(IEmployeeTerritory)
             , typeof(IOrderDetail)
             , typeof(IOrder)
             , typeof(IRegion)
             , typeof(IShipper)
             , typeof(ISupplier)
             , typeof(ITerritory));

         // IProductView setup
         var productQuery = from i in ViewHelper.EntitySet<IProduct>()
             select new
             {
                 ProductID = i.ProductID,
                 ProductName = i.ProductName,
                 QuantityPerUnit = i.QuantityPerUnit,
                 UnitPrice = i.UnitPrice,
                 UnitsInStock = i.UnitsInStock,
                 UnitsOnOrder = i.UnitsOnOrder,
                 ReorderLevel = i.ReorderLevel,
                 Discontinued = i.Discontinued,
                 CategoryName = i.Category.CategoryName,
                 CompanyName = i.Supplier.CompanyName,
                 ContactName = i.Supplier.ContactName,
                 ContactTitle = i.Supplier.ContactTitle,
             };
         RegisterView<IProductView>(productQuery, DbViewOptions.Materialized);
     }
 }
 

Here, we set up an IProductView view, using ViewHelper and essentially creating a flattened view of IProduct.  Then we register that view as part of our module using RegisterView.

Let's make use of this view in a test; see a portion of the ProductCRUDTest() test method below.  Note that we have access to the IProductView entity set, and get the expected product name from the view.

[TestMethod]
public void ProductCRUDTest()
{
    IEntitySession session1 = DomainApp.OpenSession();
    IEntitySession session2;

    // create Product
    IProduct product1 = ProductTest.GetTestProduct(session1);
    product1.ProductName = "My Product";
    session1.SaveChanges();
    Assert.IsNotNull(product1, "Create and save of IProduct item failed.");

    // read Product
    session2 = DomainApp.OpenSession();
    IProduct product2 = session2.GetEntity<IProduct>(product1.ProductID);
    Assert.IsNotNull(product2, "Retrieval of new IProduct item failed.");
    Assert.IsTrue(ProductTest.CompareItems(product1, product2), "Retrieved IProduct item match with created item failed.");

    // read Product View
    var productView = session2.EntitySet<IProductView>().Where(i => i.ProductID == product2.ProductID).FirstOrDefault();
    Assert.IsNotNull(productView);
    Assert.AreEqual(product2.ProductName, productView.ProductName);

}
 

Additional modules

We will utilize an additional VITA module in this example: Logging.  A VITA module is a reusable component that brings additional functionality, with supporting tables, into your app.  Adding modules to your application is easy.  Below we add the logging entity application to our EntityApp:

 public class DomainApp: EntityApp
    {
        public DomainApp()
        {
            this.Version = "1.0.0.2";
            
            // add main area and module
            var domainArea = this.AddArea("Domain");
            MainModule = new DomainModule(domainArea);
            
            // add user transaction log, with extra tracking columns in "transaction" entities  
            var transLogStt = new TransactionLogModelExtender();
            // add columns CreatedIn and UpdatedIn - tracking user/date of create/update events
            transLogStt.AddUpdateStampColumns(new[]
            {
                      typeof(ICustomer)
                    , typeof(IProduct)
                    , typeof(ICategory)
                    , typeof(ICustomerCustomerDemo)
                    , typeof(ICustomerDemographic)
                    , typeof(IEmployee)
                    , typeof(IEmployeeTerritory)
                    , typeof(IOrderDetail)
                    , typeof(IOrder)
                    , typeof(IRegion)
                    , typeof(IShipper)
                    , typeof(ISupplier)
                    , typeof(ITerritory)
                },
                createIdPropertyName: "CreatedIn", updateIdPropertyName: "UpdatedIn");
            
            // Error log, operation log, web call log, model change log, incident log, persistent session
            this.LoggingApp = new LoggingEntityApp("log");
            LoggingApp.LinkTo(this);
        }
    }
 

Logging modules

The logging module is a powerful building block for managing real world applications.  The overall Logging module includes several modules that you can pick and choose from, and we configured all of them in this example to be part of the “Log” area.  The following subsections describe configuring and using each of these logging modules.

Transaction log

Use this module to log database update transactions.  To configure this module, add a new TransactionLogModule with some TransactionLogSettings.  Notice in our settings that we configured AddUpdateStampColumns for each of our entities.

What does this do?  Notice that all of the tables now have a CreatedIn and an UpdatedIn column.  When a record is created or updated, these columns are filled with the id of the corresponding database update transaction.  Tying update operations to overall transactions is a much more effective means of tracking updates, which usually involve multiple records in one or more tables.  It is a very common pattern to add tracking columns to tables, such as CreatedDateTime, UpdatedDateTime, CreatedBy, and UpdatedBy.  TransactionLog allows doing the same, but using a reference to a TransactionId; the transaction record holds the date/time and the current user.

Also, the following table is added (in your configured schema area) to your database:

  • TransactionLog – Each record logs the details of an update transaction that occurred, including the change date and details, plus links to user and web call information.  Each transaction record contains the primary keys of ALL records touched by the transaction, so this log can be used when syncing databases – to find out which records changed in the db since the last sync.
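The stamping behavior described above can be sketched conceptually.  This is an illustration of the idea only, in JavaScript: VITA implements it in C# inside the framework, and saveWithTransactionLog and its id scheme are hypothetical, while CreatedIn/UpdatedIn match the columns added above:

```javascript
// Conceptual sketch of the transaction-stamp idea (hypothetical helper,
// not VITA's implementation): one transaction-log record is written per
// save, and every touched record is stamped with that transaction's id.
function saveWithTransactionLog(records, user, log) {
  var transactionId = log.length + 1; // stand-in for real id generation
  // the transaction record holds the user, date, and keys of every touched row
  log.push({
    id: transactionId,
    user: user,
    date: new Date(),
    entityKeys: records.map(function (r) { return r.id; })
  });
  records.forEach(function (r) {
    if (r.CreatedIn === undefined) {
      r.CreatedIn = transactionId; // first save stamps CreatedIn
    }
    r.UpdatedIn = transactionId;   // every save stamps UpdatedIn
  });
  return transactionId;
}
```

The point of the design: per-record audit columns hold only one id, and the who/when details live in the single transaction record that id points to.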

Operation log

Use this module to log database operations of any kind.  To configure this module, add a new OperationLogModule.  This adds the following table (in your configured schema area) to your database:

  • OperationLog – Each record logs details of a select or procedure call that was made, including parameters, date, and user info.  This log can be turned on/off on the fly, or for specific users, for detailed debugging of issues on production sites.

Error log

Use this module to log errors that may occur.  To configure this module, add a new ErrorLogModule.  This adds the following table (in your configured schema area) to your database:

  • ErrorLog – Each record logs details of an exception that occurred, including date, source, message details, and links to web call and user info.

Web call log

Use this module to log web api calls.  To configure this module, add a new WebCallModule.  This adds the following table (in your configured schema area) to your database:

  • WebCallLog – Each record logs details of a web api call including date, user and location information, request and response information, and error details if errors occurred.  For an error, the system also logs all SQL calls that were made during processing automatically.

Incident log

Use this module to log general incidents.  To configure this module, add a new IncidentLogModule.  This module adds the following tables (in your configured schema area) to your database:

  • IncidentLog – Each record logs the details of a general incident, such as a login failure or a login disabled after N failed attempts.
  • IncidentAlert

Slim Api

If you just need to package up a set of RESTful web services for your own applications, Slim Api is definitely a great, streamlined way to build these services.  Build slim api controllers in any library without the overhead of the ASP.NET WebApi libraries.  The setup for a SlimApiController is very similar to a normal web api controller, the primary difference being the use of Api attributes.  Below are the method signatures for CategoriesController, found in the VITA.CodeFirst assembly:

public class CategoriesController : SlimApiController
{
    [ApiGet, ApiRoute("categories")]
    public QueryResults<CategoryDto> GetCategories([FromUrl] CategoryQuery query)
    {
    }

    [ApiGet, ApiRoute("categorieslist")]
    public QueryResults<CategoryDto> GetCategoriesList([FromUrl] int take = 100, int categoryID = 0)
    {
    }

    [ApiGet, ApiRoute("categories/{categoryid}")]
    public CategoryDto GetCategory(int categoryID)
    {
    }

    [ApiPost, ApiRoute("categories")]
    public CategoryDto CreateCategory(CategoryDto item)
    {
    }

    [ApiPut, ApiRoute("categories")]
    public CategoryDto UpdateCategory(CategoryDto item)
    {
    }

    [ApiDelete, ApiRoute("categories/{categoryid}")]
    public void DeleteCategory(int categoryID)
    {
    }
}
 

To utilize these controllers, register them in your EntityApp with the global route prefix:

public DomainApp()
{
    this.Version = "1.0.0.2";

    // add main area and module
    var domainArea = this.AddArea("Domain");
    MainModule = new DomainModule(domainArea);

    // other entity application setup here...

    //api config
    base.ApiConfiguration.GlobalRoutePrefix = "slimapi";
    base.ApiConfiguration.RegisterControllerTypes(
        typeof(CustomersController),
        typeof(ProductsController),
        typeof(CategoriesController),
        typeof(CustomerCustomerDemosController),
        typeof(CustomerDemographicsController),
        typeof(EmployeesController),
        typeof(EmployeeTerritoriesController),
        typeof(OrderDetailsController),
        typeof(OrdersController),
        typeof(RegionsController),
        typeof(ShippersController),
        typeof(SuppliersController),
        typeof(TerritoriesController),
        typeof(ClientErrorController),
        typeof(LoggingDataController));
}
 

In any of these examples, you can test both the "normal" web api and slim api controllers.  In this example, compare /api/categories to /slimapi/categories.

Testing and admin tool

Go ahead and dig deeper: run the tests, run the admin tool application to manage your data, and review the info and logging related information to see these modules in action.

Digging deeper with VITA authorization

In the previous sections, we went through many of VITA’s core features and modules.  In this example, we are going to go deeper utilizing the Authorization framework and the supporting module Login.  Follow along with the Forums Example Solution download, which has the following projects:

  • Forums.VITA – The VITA managed data model with additional modules, and related packaging.
  • Forums.Tests – Some basic CRUD tests for exercising your VITA managed data model.  Create a test MS SQL database matching the DbConnectionString config value.
  • Forums.DS – A library of web api services and related materials.
  • Forums.UI – An MVC/AngularJS single page web application utilizing the web api services, additional modules, and authorization framework.  Create an MS SQL database matching the DbConnectionString config value.

Forums example

Our final example is a forums based model, where there are a number of kinds of posts such as discussions, issues, and comments.  Our desired database schema looks like the following:

Data model

In this example, we want to implement a common post table with the same primary key as specific tables, and for grins, implement a different table naming convention.  No problem with VITA!  Review the data model for IDiscussion and the common IPost:

    [Entity(Name="Discussion", TableName="tblForums_Discussion")]
    [Paged, OrderBy("Title")]
    public partial interface IDiscussion
    {
        [PrimaryKey, EntityRef(KeyColumns = "PostID")]
        IPost Post { get; set; }
        
        [Column("Title", Size = 255), Index(IndexName="IX_Discussion_Title")]
        string Title { get; set; }
        
        [Column("DiscussionText"), Unlimited]
        string DiscussionText { get; set; }
        
        IList<IDiscussionReply> DiscussionReplies { get; }
    }

    [Entity(Name="Post", TableName="tblForums_Post")]
    [Paged, OrderBy("IntroText")]
    public partial interface IPost
    {
        [Column("PostID"), PrimaryKey, ClusteredIndex(IndexName="PK_tblForums_Post"), Auto]
        Guid PostID { get; set; }
       
        [Column("IntroText", Size = 1000), Nullable]
        string IntroText { get; set; }
       
        IList<IVote> Votes { get; }
       
        [ManyToMany(typeof(IPostTag))]
        IList<ITag> Tags { get; }
       
        [OrderBy("CommentText")]
        [OneToMany("CommentOnPost")]
        IList<IComment> CommentOnComments { get; }
       
        [EntityRef(KeyColumns = "MemberID")]
        IMember Member { get; set; }
    }
 

Another strength of VITA's interface approach to managing your data model is that your focus is only on the data model, not on higher level design constructs.  With this forums example, you would naturally think of the information in terms of generalized and specialized information (Discussion is a type of Post, Comment is also a type of Post, etc.), and the overall system may reflect that.  VITA supports any simple or complex data model that you might need.  But VITA wisely doesn't support the notion of generalization/specialization in the data model, something that is not a directly supported relational database construct.  You define the data model exactly as you need it to support your requirements and best practices, and you translate data into higher level interfaces/classes as needed.  In this example, I implemented the data model as a classic TPT pattern, and merely flattened the generalized/specialized data in the dto classes (the Discussion dto includes base Post information, etc.).
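The flattening just described can be sketched as a dto that merges the specialized Discussion fields with the generalized Post fields.  The property names below follow the article's ToDto example; treat this as an illustrative sketch, not the exact class from the attached solution:

```csharp
// Hypothetical sketch of the flattened dto: Discussion-specific fields
// plus the generalized Post fields merged into one class.
public class DiscussionDto
{
    // From IPost (the generalized entity)
    public Guid PostID { get; set; }
    public string IntroText { get; set; }

    // From IDiscussion (the specialized entity)
    public string Title { get; set; }
    public string DiscussionText { get; set; }

    // Permission flags computed from the authorization framework
    public bool CanEdit { get; set; }
    public bool CanDelete { get; set; }
}
```

The point of the design is that the TPT split lives only in the entity interfaces; consumers of the web api see a single flat shape.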

Additional modules

Before we get into authorization, we will utilize an additional VITA module in this example, Login.  Below we add this module to our EntityApp (with the associated required logging application):

 public class DomainApp: EntityApp
    {
        public DomainApp(string cryptoKey) : this()
        {
            var cryptoService = this.GetService<IEncryptionService>();
            var cryptoBytes = HexUtil.HexToByteArray(cryptoKey);
            if (cryptoService != null) cryptoService.AddChannel(cryptoBytes); //sets up default unnamed channel
        }

        public DomainApp()
        {
            var domainArea = this.AddArea("Domain");
            var mainModule = new DomainModule(domainArea);
           
            var loginArea = this.AddArea("Login");
           
            // add login functionality
            var loginStt = new LoginModuleSettings(passwordExpirationPeriod: TimeSpan.FromDays(180)); //uses BCrypt hashing
            loginStt.RequiredPasswordStrength = PasswordStrength.Medium; 
            // var loginStt = new LoginModuleSettings(passwordHasher: new Pbkdf2PasswordHasher()); // uses Pbkdf2 hasher - inferior method
            var loginModule = new LoginModule(loginArea, loginStt);
            //EncryptedData is used by login module
            var cryptModule = new EncryptedDataModule(loginArea);
            var templateModule = new TemplateModule(domainArea);
            
            this.LoggingApp = new LoggingEntityApp("log");
            LoggingApp.LinkTo(this);

            // add trigger to suspend login after 3 failed attempts within a minute
            var loginFailedTrigger = new Vita.Modules.Login.LoginFailedTrigger(this,
                failureCount: 3, timeWindow: TimeSpan.FromMinutes(1), suspensionPeriod: TimeSpan.FromMinutes(5));
            LoggingApp.IncidentLog.AddTrigger(loginFailedTrigger);
        }
    }
 

Login

Use this module to enable login functionality.  To configure this module, add a new LoginModule with LoginModuleSettings.  Also note that you can define a trigger for login failures, with a timeout and the ability to tie into your incident log (to log login failures).  This module adds the following tables (in your configured schema area) to your database:

  • Login – Each record stores key information for a login account, including username and password info, date, and status.
  • SecretQuestion – Each record contains a question that can be used for account recovery.
  • SecretAnswer – Each record stores a user’s answer to a secret question for account recovery.
  • TrustedDevice – Each record defines a trusted device (I haven’t used this feature).
  • UserSession – Each record stores key information about each persistent session, including user, token, and expiration information.
  • UserSessionLastActive – Each record stores the date when a given user session was last active.

How do you make use of the login feature?  Typically, you will want to tie login to a user entity in your data model.  In our case, our user entity is IMember:

[Entity]
public partial interface IMember
{
    [PrimaryKey, Auto]
    Guid MemberID { get; set; }

    string DisplayName { get; set; }

    string FirstName { get; set; }

    string LastName { get; set; }

    string EmailAddress { get; set; }

    UserType Type { get; set; } //might be combination of several type flags

    IList<IVote> Votes { get; }

    IList<IPost> Posts { get; }
}
 

Note that we added a UserType enum property to this entity.  We will make use of this in the authorization framework.
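Since a member may combine several type flags (the comment on the property hints at this, and the registration code below ORs Administrator onto Member), UserType is naturally modeled as a flags enum.  A minimal sketch, where only Member and Administrator come from the article and the rest is illustrative:

```csharp
using System;

// Sketch of the UserType flags enum; Member and Administrator are the
// values used in the article, None is an illustrative assumption.
[Flags]
public enum UserType
{
    None = 0,
    Member = 1,
    Administrator = 1 << 1,
}
```

With a flags enum, a single admin member simply carries `UserType.Member | UserType.Administrator`.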

For application use, we need a complete login and registration process, and VITA provides services that we can readily use for this.  Below, we tie login with the IMember user entity during registration (see AuthenticationController in the Forums.DS project for more details):

[HttpPost, Route("api/auth/register")]
public LoginResponseDto Register(RegisterDto registerDto)
{
    // create user
    var session = OpenSession();
    IMember member = session.NewEntity<IMember>();
    member.DisplayName = registerDto.DisplayName;
    member.FirstName = registerDto.FirstName;
    member.LastName = registerDto.LastName;
    member.EmailAddress = registerDto.EmailAddress;
    member.Type = UserType.Member;
    if (registerDto.IsAdmin)
        member.Type |= UserType.Administrator;
    session.SaveChanges();

    // create login
    var login = _loginManagementService.NewLogin(session, registerDto.UserName, registerDto.Password, userId: member.MemberID, loginId: member.MemberID);
    session.SaveChanges();

    // login
    LoginDto loginDto = new LoginDto { UserName = registerDto.UserName, Password = registerDto.Password };
    return Login(loginDto);
}
 

At the create login step, we tie the login record with our member record.  Now we can log in and create our persistent sessions with our authentication token:

[HttpPost, Route("api/auth/login")]
public LoginResponseDto Login(LoginDto loginDto)
{
    //Login using LoginService
    var loginResult = _loginService.Login(loginDto.UserName, loginDto.Password);
    if(!loginResult.Success)
        return new LoginResponseDto() { ResultCode = "LoginFailed" };

    OpContext.User = loginResult.User;
    _sessionService.StartSession(OpContext);
    var userSession = OpContext.UserSession;
    var resp = new LoginResponseDto() {ResultCode = "Success", AuthenticationToken = userSession.Token };
    return resp;
}
 

And of course we can logout to end our session:

[HttpDelete, Route("api/auth/login"), AuthenticatedOnly]
public void Logout()
{
    _loginService.Logout(OpContext.User);
    var userSession = OpContext.UserSession;
    if(userSession != null)
    {
        _sessionService.EndSession(OpContext);
    }
}
 

Authorization framework

Of all of VITA’s additional features, I think the Authorization framework really stands out over other frameworks out there.  With this framework, you can easily define rules using entity resources, filters, permissions, and activities to precisely determine what a user can do, even down to the property level.  We are going to go through a basic scenario here.

Authorization roles and rules

For our forums application, we want 3 types of users with the following rules:

  • Public – A non-logged in user that can view anything.
  • Member – A logged in user that can view anything and create/edit/delete their own posts.
  • Admin – A logged in user that can view and create/edit/delete anything.

Following is the core of the DomainAuthorizationHelper class that defines these roles and rules:

 public static class DomainAuthorizationHelper
    {
        public static void EnsureInitialized()
        {
            var memberDataFilter = new AuthorizationFilter("MemberData");
            memberDataFilter.Add<IMember, Guid>((i, userId) => i.MemberID == userId);
            memberDataFilter.Add<IComment, Guid>((i, userId) => i.Post.Member.MemberID == userId);
            memberDataFilter.Add<IDiscussion, Guid>((i, userId) => i.Post.Member.MemberID == userId);
            memberDataFilter.Add<IDiscussionReply, Guid>((i, userId) => i.Post.Member.MemberID == userId);
            memberDataFilter.Add<IIssue, Guid>((i, userId) => i.Post.Member.MemberID == userId);
            memberDataFilter.Add<IIssueReply, Guid>((i, userId) => i.Post.Member.MemberID == userId);
            memberDataFilter.Add<IPost, Guid>((i, userId) => i.Member.MemberID == userId);
            memberDataFilter.Add<IPostTag, Guid>((i, userId) => i.Post.Member.MemberID == userId);
            memberDataFilter.Add<IVote, Guid>((i, userId) => i.Member.MemberID == userId);
            
            // Entity resources
            var entities = new EntityGroupResource("Entities"
                , typeof(IComment)
                , typeof(IDiscussion)
                , typeof(IDiscussionReply)
                , typeof(IIssue)
                , typeof(IIssueReply)
                , typeof(IPost)
                , typeof(IIssueStatus)
                , typeof(ITag)
                , typeof(IPostTag)
                , typeof(IVote));
            var members = new EntityGroupResource("Members", typeof(IMember));
            
            // Permissions
            var browseAll = new EntityGroupPermission("BrowseAll", AccessType.Read, entities, members);
            var register = new EntityGroupPermission("Register", AccessType.Create, members);
            var manageAccount = new EntityGroupPermission("ManageAccount", AccessType.CRUD, members);
            var manageEntities = new EntityGroupPermission("ManageEntities", AccessType.CRUD, entities);
            
            // Activities
            var browsing = new Activity("Browsing", browseAll);
            var registering = new Activity("Registering", register);
            var editing = new Activity("Editing", manageAccount, manageEntities);
            
            // Roles
            // Public role can browse through anything and register
            PublicUser = new Role("PublicUser", browsing, registering);
            // Member role can browse and edit own stuff
            MemberUser = new Role("MemberUser");
            MemberUser.ChildRoles.Add(PublicUser);
            MemberUser.Grant(memberDataFilter, editing);
            // Admin role can browse and edit anything
            AdminUser = new Role("AdminUser", editing);
            AdminUser.ChildRoles.Add(MemberUser);
            AdminUser.ChildRoles.Add(PublicUser);
        }
    }
 

We configured the roles and rules above as follows:

  • Data Filter – A data filter is an object that answers a simple question: is this entity X connected to user Y?  If yes, the associated permission is enabled.  We set up a memberDataFilter to filter all of the types of posts and the member entity by the currently logged in member.  We get the current user id from UserIdReader().
  • Entity Resources – We define 2 groups of entity resources: members for the IMember entity, and entities for everything else.  We want to be able to let a public user create an IMember record during registration.
  • Permissions – We define 4 permissions: browseAll to be able to read the entities and members resources, register to be able to create the members resource, manageAccount to be able to edit the members resource, and manageEntities to be able to edit the entities resources.
  • Activities – We define 3 activities: browsing to utilize the browseAll permission, registering to utilize the register permission, and editing to utilize the manageAccount and manageEntities permissions.
  • Roles – Our roles are defined as:
    • PublicUser – A public user can perform the browsing activity to view anything, and can perform the registering activity to register.
    • MemberUser – A member can perform public user activities, and can perform the editing activity for items that pass the memberDataFilter.
    • AdminUser – An admin can perform public and member user activities, and can perform the editing activity for all items.

Make sense?
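The GetRoles helper called from GetUserRoles maps the member's UserType flags onto the role objects defined above.  Its implementation isn't shown in the article; a minimal sketch of what such a mapping might look like inside DomainAuthorizationHelper:

```csharp
using System.Collections.Generic;

// Hypothetical sketch: map UserType flags to the roles defined above.
// Roles are cumulative via ChildRoles, so the highest matching role suffices.
public static IList<Role> GetRoles(UserType type)
{
    var roles = new List<Role>();
    if (type.HasFlag(UserType.Administrator))
        roles.Add(AdminUser);       // includes MemberUser and PublicUser as child roles
    else if (type.HasFlag(UserType.Member))
        roles.Add(MemberUser);      // includes PublicUser as a child role
    else
        roles.Add(PublicUser);
    return roles;
}
```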

To wire up these roles and rules, we need to override GetUserRoles in our EntityApp:

public override IList<Role> GetUserRoles(UserInfo user)
{
    DomainAuthorizationHelper.EnsureInitialized();
    var list = new List<Role>();
    switch(user.Kind)
    {
        case UserKind.Anonymous:
            list.Add(DomainAuthorizationHelper.PublicUser);
            return list;
        case UserKind.AuthenticatedUser:
            var session = this.OpenSystemSession();
            var iUser = session.GetEntity<IMember>(user.UserId);
            var roles = DomainAuthorizationHelper.GetRoles(iUser.Type);
            return roles;
    }
    return new List<Role>();
}
 

Utilizing authorization

Now let’s take another look at a couple of web api controller methods for IDiscussion (see DiscussionsController in the Forums.DS project):

[HttpGet, Route("api/discussions/{postid}")]
public DiscussionDto GetDiscussion(Guid postID)
{
    var session = OpenSecureSession();
    var item = session.GetEntity<IDiscussion>(postID);
    if (item == null)
    {
        WebContext.CustomResponseStatus = HttpStatusCode.BadRequest;
        WebContext.ResponseBody = String.Format("Discussion with ID '{0}' not found.", postID);
        return null;
    }
    DiscussionDto itemDto = item.ToDto(true);
    Type[] blockingEntities;
    itemDto.CanDelete = itemDto.CanDelete && session.CanDeleteEntity<IDiscussion>(item, out blockingEntities);
    return itemDto;
}

[HttpPost, Route("api/discussions"), AuthenticatedOnly]
public DiscussionDto CreateDiscussion(DiscussionDto item)
{
    return CreateUpdateDiscussion(item, create: true);
}
 

Now that we have our framework in place, we can make direct use of a couple of things:

  • Secure sessions – Now we can use OpenSecureSession() to open a secure session for both logged in and public users.  A secure session is a session associated with a particular user (the currently logged in user), and all data operations are verified against user permissions; it enables VITA’s entity access authorization, so that we can make use of our authorization roles and rules.
  • Authenticated requests – Now we can use the AuthenticatedOnly attribute to allow requests from logged in users only.  If a user is not authenticated, VITA will throw an authentication-required exception, resulting in a BadRequest response.

Now, let’s take a look at the IDiscussion dto extension class:

public static class DiscussionDtoExtensions
{
     public static DiscussionDto ToDto(this IDiscussion discussion)
     {
         var discussionDto = new DiscussionDto()
         {
             Title = discussion.Title,
             DiscussionText = discussion.DiscussionText,
             PostID = discussion.Post.PostID,
             IntroText = discussion.Post.IntroText,
             CanEdit = true,
             CanDelete = true
         };
         var permissions = EntityHelper.GetEntityAccess(discussion);
         discussionDto.CanEdit = permissions.CanUpdate();
         discussionDto.CanDelete = permissions.CanDelete();
         return discussionDto;
     }
 }
 

Here we use VITA’s EntityHelper to get the GetEntityAccess permissions for the discussion item.  We can examine these permissions to check whether the user can peek/read/update/delete the corresponding data, and act based on those permissions.  We expose whether the user can update or delete the data as the CanEdit and CanDelete properties of the dto object, so that the UI can bind to them and show or hide functions based on permissions.

Wow, how easy it is to use such a powerful authorization framework!

AngularJS application with login and authentication scenarios

The best way to get a feel for how the authorization framework works is to play around with the example admin tool application.  Register both as normal user and as an admin user (you have the power to make yourself an administrator!).  View/edit information while logged in (and out) with different user types.

Following is a view of discussions as seen by a public (not logged in) user, with only access to view information:

Following is the same view for Bob, a non admin member.  Bob can edit his own discussions and create new ones:

Following is the same view for an admin user.  The admin user can create and edit anything:

Note that if Bob (a non-admin user) tries to play dirty and fabricates an update service call to update an item he is not allowed to update (using Fiddler, for example), then VITA authorization will intercept it and throw an AccessDenied exception, effectively canceling the operation.

Using VITA for real projects

This article includes some example web api applications using VITA.  But what about using VITA to build large, scalable enterprise applications in the real world?

Features

In terms of core ORM features, general application support, and in particular web api support, I think VITA stands in the top tier of the pack of available frameworks out there.  As mentioned previously, VITA particularly encourages building scalable, loosely coupled architectures.  I would have no concerns at all from a feature standpoint with using VITA on a real enterprise project.

Open source and level of support

One concern enterprise businesses often have in adopting an open source solution is in the arena of support.  How active is the project?  How many installations are out there?  How responsive is support?

The VITA open source project is very active and is unlikely to become a dead end for the foreseeable future.  A 1.1 release with updated nuget packages came out recently.  I do know that a number of additional features are in the works.  I am comfortable that VITA is not going away.

How many installations and users of VITA are out there?  From Roman, I understand VITA is actively used in the production and development of several applications, is running on several production servers in the cloud, and is even being utilized on the International Space Station (ISS).

How many contributing members to the project are providing support?  No additional developers are listed on the site, and if I can get more data on that I will provide some info.  But I can say from experience in the Irony project that Roman is long term focused and very responsive to issues and discussions and making sure they get resolved.  I expect no less on the VITA project.

Managing production data

Being able to maintain the integrity of your production data, and having the ability to apply any schema or data updates you need, is critical in any production environment.  The current recommended approach for production environments is to not have VITA automatically update your schema.  Instead, use VITA's vdbtool to generate the DDL scripts for the database changes.  You can then edit the scripts to make any necessary adjustments before applying the changes to production.  I plan on running some tests on this and updating the article with a couple of scenarios.

Continuing on with VITA

Below are some additional resources to help you going forward to utilize VITA and/or learn more about VITA features and capabilities.

Downloads

The easiest way to start using VITA in your .net applications is to install VITA nuget packages.  The available packages include:

  • VITA – This package includes the core ORM libraries and MS SQL Server driver.
  • VITA.Modules – This package includes modules such as logging and login (encrypted data and party have not been covered here).
  • Vita.Web – This package includes integration of the ORM functionality with the web api stack.
  • Vita.Data.xxx – These packages support additional databases such as MS SQL CE, MySQL, PostgreSQL, and SQLite.

You can also download VITA from the VITA github site.

Documentation

The VITA github site is the best source for documentation on VITA’s features and capabilities, and current issues.  The VITA codeplex site may also be useful for legacy discussions and issues.

Examples

In addition to utilizing the examples in this article, if you download the source from the VITA github site, you can review and try out the BookStore example which utilizes some features in greater detail than the ones in this article.

VITA db tool

The source download at the VITA github site also includes a db tool (code generator) that lets you initially generate your data model from an existing database.  This provides you with a good starting point for ongoing work, especially when you have a legacy database to start with.

Mo+ templates

The sample applications were generated using Mo+.  You can use the templates found in the Mo+ VITA Templates download to generate overall application layers for your database/model as a great starting point for ongoing work.  You can use a source database for your Mo+ model, and thus take a "db first" approach with your overall application.  In any case, Mo+ will make the proper ongoing updates to all of your code based on changes you make to your Mo+ model.  Follow the instructions in the readme file in that download to get started with the templates.

In conclusion

I hope this article has given you a good look at the VITA ORM and .NET application framework, and has inspired you to look deeper into utilizing this framework.  Please post any thoughts you have below about VITA and/or the article or application downloads in general.  I for one can’t wait to use VITA on future projects!

LINK: https://www.codeproject.com/Articles/879568/VITA-A-Powerful-and-Flexible-ORM-and-Additional-Bu

Package of the week: BenchmarkDotNet

The week in .NET – On .NET on Docker and new Core tooling, Benchmark.NET, Magicka | .NET Blog

http://benchmarkdotnet.org/

When done properly, benchmarking is a great way to guide your engineering choices by comparing multiple solutions to a problem known to cause performance bottlenecks in your applications. There’s a lot of methodology involved if you want to do it right, however, that is both tricky and repetitive. And no, surrounding your code with a Stopwatch won’t cut it.

BenchmarkDotNet makes it very easy to decorate the code that you want to test so it can be discovered, run many times, and measured. BenchmarkDotNet takes care of warmup and cooldown periods as needed, and will compute mean running times and standard deviation for you. It can also generate reports in a variety of formats.

https://github.com/dotnet/BenchmarkDotNet

[Benchmark(Description = "ImageSharp Resize")]
public ImageSharpSize ResizeImageSharp()
{
  ImageSharpImage image = new ImageSharpImage(Width, Height);
  image.Resize(ResizedWidth, ResizedHeight);
  return new ImageSharpSize(ResizedWidth, ResizedHeight);
}
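A benchmark class like the one above is typically executed from a console application via BenchmarkRunner, which discovers the [Benchmark] methods, handles warmup and measured iterations, and prints the summary report.  A minimal sketch (the LoadResizeSave class name is an illustrative assumption):

```csharp
using BenchmarkDotNet.Running;

public class Program
{
    public static void Main(string[] args)
    {
        // Runs every [Benchmark] method on the class and reports
        // mean times, standard deviation, and more.
        var summary = BenchmarkRunner.Run<LoadResizeSave>();
    }
}
```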


LINK: https://blogs.msdn.microsoft.com/dotnet/2017/02/07/the-week-in-net-on-net-on-docker-and-new-core-tooling-benchmark-net-magicka/

Polyglot Persistence Using DDD and Repository Unit of Work

Working with different data stores (SQL, NoSQL ..) in high performance enterprise application using DDD and Repository unit of work pattern

Introduction

Traditional business applications usually use a relational database to store data and use this database as an integration point. That entails designing sets of relational tables and accessing them with a single data access layer in code, as well as using an ORM to convert relational tables to an OOP structure.

Nowadays, that might not be optimized for every business use case scenario, considering some challenges encountered with relational database design. It's unnecessary to list all the disadvantages of such a design here, as many articles cover them. But let's give a hint:

Agility: A mismatch between the data structure in the database and the application object model has an adverse effect on productivity.

Performance: Relational models were designed to take minimum disk space and to consume fewer resources, which has some side effects and performance limitations.

Availability and price: Increasing availability in the relational model complicates consistency and comes at a price.

Flexibility: Not all developers are familiar with relational models and schemas; in some situations, complex business models require a thorough knowledge of the SQL language and relational models.

In a number of enterprise applications that capture a huge amount of data and handle many concurrent requests, the relational model did not meet the requirements. Looking for other alternatives to solve a specific set of persistence problems leads many designers to select non-relational systems referred to as NoSQL databases.

Non-relational systems or NoSQL databases can be categorized by a set of functional areas: key/value, document databases, column family databases, and graph databases. Each category can fit one or more business use case scenarios, and it's up to the designer to choose the most appropriate system. Some NoSQL databases use the filesystem for persistence, while others are in memory.

Background

The advent of various NoSQL databases and the abundance of cheap storage spaces (disk and memory) will be a momentum for designers and developers to switch to non-relational models.

However, an obvious question comes to mind: how do you design an application so that it can talk to different data stores?

What Are We Going To Achieve At The End?

We are going to design and build a simple application using DDD and Repository Unit of Work pattern. This application will persist data in relational (SQL) and non relational (NoSQL) data stores.

As a real world example, we will use an enterprise product store, let's say an enterprise (customer) with many organizations, departments and employees managing a huge amount of products and each employee can search, list, add, update and remove products. Also, the application needs to store the users' traffic to help administrators to address possible performance issues. To meet the requirement, the application will be deployed on the cloud (Windows Azure) and on-premises. Finally, for demos and testing purposes, the application will store all the data in memory.

Selecting Databases

For our example, products store application we can recognize different types of data requirements. For each requirement, we will use a specific database type as follows:

  • Customers, organizations, departments and users require a safe, reliable database to keep the hierarchical structure intact, and those entities don't change very frequently. So, we will persist them in a relational database, let's say SQL Server.
  • The application is supposed to hold a large number of products, and it has to filter, list and search products with acceptable performance. The information held in each product may be different. So, for those reasons, we will persist products in a NoSQL document database. Here, we will use MongoDB.
  • The application will allow the users to take some notes and share them with other users. Notes are just text that can be listed, searched and filtered. For notes, the text will be stored in a Lucene index.
  • The application will store product images in cloud blob storage as a key/value NoSQL database.
  • The application will log errors and store change history in cloud table storage as a key/value NoSQL database.

Designing and Creating a Solution

Before we start coding, let's give a brief high level design structure to our application.

Here is the layered diagram of our solution:

From the diagram above, we can examine the principal layers used in the application:

  • Presentation: ASP.NET MVC5 application with HTML5 AngularJS
  • Distributed Services: MVC 5 Web API application exposes REST services
  • Application management: Manage the business rules
  • Domain Objects: Core Application business definitions, Interfaces, entities, aggregates
  • Data: Repositories and a specific unit of work implementations. Repository for each domain entity or aggregate and Unit of Work for each data store type.
  • Data stores: At the bottom of the figure above, we see the different data stores that we will use in application SQL server, MongoDB, Azure Storage, Lucene index and in-memory fake store

To examine the interactions between components, let's examine the component diagram below:

Presentation: In the presentation layer, an ASP.NET MVC 5 project serves a SPA built with HTML5 and AngularJS, which consumes the Web API services in the distributed services layer directly over HTTP/HTTPS.

Distributed Services: Web API exposing REST services over HTTP/HTTPS.

Application Management: Class library referenced by the Web API and MVC applications.

Domain Objects: Class library referenced by the application management and infrastructure data layers.

Infrastructure Data: Class library referencing Domain Objects, along with several libraries used as connectors to the data stores:

  • Entity Framework 6 for SQL Server
  • mongocsharpdriver (the MongoDB C# driver) for MongoDB
  • Lucene.Net for the Lucene index
  • the Windows Azure storage API for Azure Storage

Now let's zoom in on the Domain Objects layer and see the entity definitions in this map diagram:

You can find this diagram in the attached solution.

Working with the Code

Domain Entities

After this general design description of our example application, let's delve into some code. We will mainly focus on the domain object definitions and the infrastructure data layer; for the other layers, you can download the attached example.

Starting with the domain object definitions: we first define a base class for all entities which contains the common properties, decorated with attributes from several of the libraries listed above. Here is the code of our base class:

/// <summary>
/// Base class for entities
/// </summary>
public abstract class EntityBase
{
    #region Members
    Guid _Id;
    #endregion

    #region Properties
    /// <summary>
    /// Get or set the persisted object identifier
    /// </summary>
    [Key]
    [Field(Key = true)]
    [BsonId]
    public virtual Guid Id
    {
        get { return _Id; }
        set { _Id = value; }
    }

    /// <summary>
    /// Get or set the date of creation
    /// </summary>
    [DataType(DataType.DateTime)]
    public DateTime CreationDate { get; set; }

    /// <summary>
    /// Get or set the date of last update
    /// </summary>
    [DataType(DataType.DateTime)]
    public DateTime LastUpdateDate { get; set; }
    #endregion
}
 

This class defines the Id property decorated with several attributes:

  • Key: Entity Framework attribute; the property will be mapped to the SQL primary key
  • BsonId: MongoDB attribute; the Id property will be mapped as the document identifier
  • Field(Key = true): Lucene.Net attribute; the property is used as the document identifier in the index
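To illustrate, a concrete entity only needs to inherit from EntityBase to pick up the shared identifier and audit dates. The sketch below is hypothetical (the Product class and its properties are not from the attached solution, and the store-specific attributes are omitted for brevity):

```csharp
using System;

// Trimmed copy of the article's base class (attributes omitted for brevity).
public abstract class EntityBase
{
    public virtual Guid Id { get; set; }
    public DateTime CreationDate { get; set; }
    public DateTime LastUpdateDate { get; set; }
}

// Hypothetical concrete entity: inherits Id, CreationDate and LastUpdateDate.
public class Product : EntityBase
{
    public string Name { get; set; }
    public decimal Price { get; set; }
}

public static class Demo
{
    public static Guid Run()
    {
        var product = new Product
        {
            Id = Guid.NewGuid(),        // the identifier lives on the base class
            Name = "Widget",
            Price = 9.99m,
            CreationDate = DateTime.UtcNow
        };
        return product.Id;
    }
}
```

Because every entity shares this base, the same identifier property can be mapped by all three stores at once via the attributes shown above.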

That covers the entities. Now let's see how to define a contract for our repositories. Here is the definition:

/// <summary>
/// Base interface to implement the Repository pattern
/// </summary>
/// <typeparam name="TEntity">Type of entity for this repository</typeparam>
public interface IRepository<TEntity>
    where TEntity : EntityBase
{
    /// <summary>
    /// Get the unit of work used by this repository
    /// </summary>
    IUnitOfWork UnitOfWork { get; }

    /// <summary>
    /// Add an item to the repository
    /// </summary>
    /// <param name="item">Item to add to the repository</param>
    Task AddAsync(TEntity item);

    /// <summary>
    /// Delete an item
    /// </summary>
    /// <param name="item">Item to delete</param>
    Task RemoveAsync(TEntity item);

    /// <summary>
    /// Get an element by entity key
    /// </summary>
    /// <param name="id">The entity key value</param>
    /// <returns></returns>
    Task<TEntity> GetElementByIdAsync(Guid id,
                                   CancellationToken cancellationToken = default(CancellationToken));

    /// <summary>
    /// Get all elements of type {TEntity} in the repository
    /// </summary>
    /// <param name="pageIndex">Page index</param>
    /// <param name="pageCount">Number of elements in each page</param>
    /// <param name="orderBy">Order by expression for this query</param>
    /// <param name="ascending">Specify whether the order is ascending</param>
    /// <returns>List of selected elements</returns>
    Task<IEnumerable<TEntity>> GetPagedElementsAsync<T>(int pageIndex, int pageCount,
                         Expression<Func<TEntity, T>> orderBy, bool ascending,
                         CancellationToken cancellationToken = default(CancellationToken));
}
 

Every entity and aggregate used in the application will have a repository class that implements the contract above. As you can see, the repository exposes its unit of work, which will be injected at run time; for each store, we define a unit of work and inject it into the repository. Here is the contract for the unit of work:

public interface IUnitOfWork : IDisposable
{
    /// <summary>
    /// Commit all changes made in a container.
    /// </summary>
    /// <remarks>
    /// If the entity has fixed properties and an optimistic concurrency
    /// problem exists, an exception is thrown
    /// </remarks>
    void Commit();

    ......

    /// <summary>
    /// Commit all changes made in a container, asynchronously.
    /// </summary>
    /// <remarks>
    /// If the entity has fixed properties and an optimistic concurrency
    /// problem exists, an exception is thrown
    /// </remarks>
    Task CommitAsync(CancellationToken cancellationToken = default(CancellationToken));

    ......

    /// <summary>
    /// Commit all changes made in a container, asynchronously.
    /// </summary>
    /// <remarks>
    /// If the entity has fixed properties and an optimistic concurrency
    /// problem exists, the 'client changes' are kept - client wins
    /// </remarks>
    Task CommitAndRefreshChangesAsync(CancellationToken cancellationToken = default(CancellationToken));
}
 

Accessing Data Stores

For each data store, we define a unit of work that we can then inject into a specific repository. The unit of work contract defined above in the domain object layer is what each store-specific unit of work implements.

Let's start with the relational database, SQL Server, probably the most familiar case. Here we use Entity Framework and create a unit of work that implements the base contract and inherits from the Entity Framework DbContext. Below is the code used in this example.

public class MainBCUnitOfWork : DbContext, IMainBCUnitOfWork
    {
        #region Fields
        private IDbSet<Customer> _customers;
        private IDbSet<Address> _addresses;
        private IDbSet<Department> _departments;
        private IDbSet<Organization> _organizations;
        .......
        #endregion

        #region Properties
        public IDbSet<Customer> Customers
        {
            get
            {
                if (this._customers == null)
                    this._customers = (IDbSet<Customer>)this.Set<Customer>();
                return this._customers;
            }
        }
 
        public IDbSet<Department> Departments
        {
            get
            {
                if (this._departments == null)
                    this._departments = (IDbSet<Department>)this.Set<Department>();
                return this._departments;
            }
        }

         ..............
         #endregion
.........
        public virtual IQueryable<TEntity> CreateSet<TEntity>() where TEntity : class, new()
        {
            return (IDbSet<TEntity>)this.Set<TEntity>();
        }

        public virtual void Commit()
        {
            try
            {
                this.SaveChanges();
            }
            catch (DbEntityValidationException ex)
            {
                throw this.GetDBValidationExptions(ex);
            }
        }
         ...............

       public async Task CommitAsync(CancellationToken cancellationToken = default(CancellationToken))
        {
            cancellationToken.ThrowIfCancellationRequested();
            try
            {
                await this.SaveChangesAsync(cancellationToken);
            }
            catch (DbEntityValidationException ex)
            {
                throw this.GetDBValidationExptions(ex);
            }
        }
 ........................
 }
 

The full definition of the class can be found in the attached code.

For each entity stored in the SQL Server database, an IDbSet property is defined. The CreateSet method will be used to retrieve data.

In the same way, we define a unit of work for MongoDB using the MongoDB C# driver, which you can add as a NuGet package. Here is the code used:

 public class MongoUnitOfWork : IMongoUnitOfWork
    {
        #region Fields
        string _dbHostName;
        string _dbName;
        MongoDatabase _database;
       ......

         #region properties
        public string DbName
        {
            get {
                if (string.IsNullOrEmpty(this._dbName))
                {
                    this._dbName = "polyGlotDemo";
                }
                return _dbName;
            }
            set { _dbName = value; }
        }

       public string DbHostName
        {
            get {
                if (string.IsNullOrEmpty(this._dbHostName))
                {
                    this._dbHostName = "127.0.0.1";
                }
                return _dbHostName;
            }
            set { _dbHostName = value; }
        }

        public MongoDatabase Database
        {
            get { return _database; }
            set { _database = value; }
        }

        public IDbSet<DepartmentAggregate> Departments { get; set; }
        #endregion
        #region Ctor
        public MongoUnitOfWork()
        {
            var pack = new ConventionPack();
            pack.Add(new CamelCaseElementNameConvention());
            ConventionRegistry.Register("MongoUnitOfWorkPack", pack, (t) => true);
            string connectionString = "mongodb://" + this.DbHostName;
            MongoClientSettings settings = MongoClientSettings.FromUrl(new MongoUrl(connectionString));
            settings.WriteConcern.Journal = true;
            var mongoClient = new MongoClient(settings);
            var mongoServer = mongoClient.GetServer();
           if (!mongoServer.DatabaseExists(this.DbName))
            {
                throw new MongoException(string.Format
                (CultureInfo.CurrentCulture, Messages.DatabaseDoesNotExist, this.DbName));
            }
             this.Database = mongoServer.GetDatabase(this.DbName);
            var coll = this.Database.GetCollection<
            			DepartmentAggregate>("DepartmentAggregate");
            //coll.RemoveAll();
            foreach (var dep in data.DepartmentAggregates)
            {
                if (!coll.AsQueryable().Any(x => x.Id == dep.Id))
                {
                    coll.Insert<DepartmentAggregate>(dep);
                }
            }

           this.Departments = new MemorySet<DepartmentAggregate>();
        }
        #endregion
        .........
        #endregion
}


The full definition of the class can be found in the attached code.

For the other stores and unit of work, you can browse the code. The logic is the same.
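To give a feel for how the remaining pieces fit together, here is a hypothetical, miniature version of the in-memory fake store mentioned in the requirements. The MemoryUnitOfWork and MemoryRepository names are illustrative (they are not the attached code), and paging is reduced to plain LINQ over a list:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Minimal stand-ins for the article's contracts.
public abstract class EntityBase { public Guid Id { get; set; } }
public class Product : EntityBase { public string Name { get; set; } }

// Hypothetical in-memory unit of work: Commit is a no-op because
// changes are applied to the backing lists immediately.
public class MemoryUnitOfWork
{
    private readonly Dictionary<Type, object> _sets = new Dictionary<Type, object>();

    public List<T> Set<T>() where T : EntityBase
    {
        if (!_sets.ContainsKey(typeof(T)))
            _sets[typeof(T)] = new List<T>();
        return (List<T>)_sets[typeof(T)];
    }

    public void Commit() { /* nothing to flush for the in-memory store */ }
}

// Hypothetical repository working against the injected in-memory unit of work.
public class MemoryRepository<T> where T : EntityBase
{
    private readonly MemoryUnitOfWork _uow;
    public MemoryRepository(MemoryUnitOfWork uow) { _uow = uow; }

    public void Add(T item) { _uow.Set<T>().Add(item); }

    public T GetById(Guid id) { return _uow.Set<T>().FirstOrDefault(x => x.Id == id); }

    public IEnumerable<T> GetPaged<TKey>(int pageIndex, int pageCount,
                                         Func<T, TKey> orderBy, bool ascending)
    {
        var ordered = ascending
            ? _uow.Set<T>().OrderBy(orderBy)
            : (IEnumerable<T>)_uow.Set<T>().OrderByDescending(orderBy);
        return ordered.Skip(pageIndex * pageCount).Take(pageCount);
    }
}
```

Because the repository only talks to the injected unit of work, swapping the in-memory store for SQL Server or MongoDB is a matter of injecting a different unit of work implementation.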

Running the Example

Here are some prerequisites for working with the attached example:

  • Download MongoDB from https://www.mongodb.org/downloads
    • Start the MongoDB server and create a database and collection.
  • Visual Studio Ultimate, if you would like to open the modeling project
  • Azure SDK 2.5, to emulate Azure Storage
  • Run Visual Studio as administrator
  • Run the Azure solution
  • Use NuGet to restore any missing packages

Points of Interest

This example demonstrates an application managing multiple stores and multiple environments using DDD, Windows Azure, MongoDB, Lucene.Net and more.

I hope this example helps; any comments are welcome.

LINK: https://www.codeproject.com/Articles/889978/Polyglot-Persistence-Using-DDD-and-Repository-Unit

Web API Architecture And Dependency Injection Best Practices


This article explains the Web API best practices for architecture and Dependency Injection, using Unity and other options available for Dependency Injection.

Over the last few years, Web API has become very popular, and these days a lot of projects use it; thousands of projects have been developed with Web API.

If you are working with Web API, its architecture and best practices are among the most important things to understand, as they enable you to build the best possible application. There are more than a hundred rules and recommendations among the Web API best practices, but in this article I am going to cover only architecture and Dependency Injection.

Web API Architecture

Before discussing Dependency Injection, it is worth discussing Web API architecture best practices, because the Dependency Injection setup depends on the architecture. Here, I am going to refer to one of my previous articles.

It is recommended to go through my previous article, because I will be using its source code and concepts. You can download the source from the attachment section of that article.

Step 1

Create Layered Architecture

Generally, Web API has 3 layers, which are given below.

  1. Web API Layer
  2. Business Layer (Business Logic Layer)
  3. Repository Layer (Data Access Layer)

You can see that the Business Layer interfaces do not have any dependency on the Repository Layer interfaces; the Repository Layer interfaces are referenced only inside the Business Layer classes.
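As a sketch of that layering (the IUserBusiness, IUserRepository and User names match the registrations shown later, but the method bodies and the FakeUserRepository are hypothetical), the business class depends on the repository interface while the business interface itself does not:

```csharp
using System.Collections.Generic;

public class User { public int Id { get; set; } public string Name { get; set; } }

// Repository Layer interface (Data Access).
public interface IUserRepository
{
    IEnumerable<User> GetAll();
}

// Business Layer interface: note it has no reference to IUserRepository.
public interface IUserBusiness
{
    IEnumerable<User> GetUsers();
}

// Only the Business Layer *class* depends on the repository interface,
// and it receives it through constructor injection.
public class UserBusiness : IUserBusiness
{
    private readonly IUserRepository _repository;
    public UserBusiness(IUserRepository repository) { _repository = repository; }
    public IEnumerable<User> GetUsers() { return _repository.GetAll(); }
}

// Hypothetical stand-in repository, e.g. for a unit test.
public class FakeUserRepository : IUserRepository
{
    public IEnumerable<User> GetAll()
    {
        return new List<User> { new User { Id = 1, Name = "Ada" } };
    }
}
```

Keeping the dependency direction this way means the API layer never needs to know the repository types exist, which is exactly what the registration comparison below is about.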

Step 2

Add POCOs (Plain Old CLR Objects).

Step 3

Add repository layer project to implement Repository layer interfaces and Business layer project to implement Business layer interfaces.

In the Solution Explorer, it will look as shown below.

Now I am going to implement Dependency Injection using Unity. Right-click the API project, open the NuGet Package Manager and search for Unity.

Install the two packages, as shown in the preceding screenshot.

After installing Unity at the Web API layer, it adds some files and code, which are shown below.

UnityConfig.cs 

 
public static class UnityConfig
{
    public static void RegisterComponents()
    {
        var container = new UnityContainer();
        GlobalConfiguration.Configuration.DependencyResolver = new UnityDependencyResolver(container);
    }
}

UnityResolver.cs

 

 
public class UnityResolver : IDependencyResolver
{
    protected IUnityContainer container;
    public UnityResolver(IUnityContainer container) {…………………………}
    public IDependencyScope BeginScope() {…………………………}
    public void Dispose() {…………………………}
    public object GetService(Type serviceType) {…………………………}
    public IEnumerable<object> GetServices(Type serviceType) {…………………………}
}

Add the code given below to the file.

WebApiConfig.cs

 

 
public static void Register(HttpConfiguration config)
{
    var container = new UnityContainer();
    container.RegisterType<IUserRepository, UserRepository>(new HierarchicalLifetimeManager());
    container.RegisterType<IUserBusiness, UserBusiness>(new HierarchicalLifetimeManager());
    container.RegisterType<IBaseRepository<User>, BaseRepository<User>>(new HierarchicalLifetimeManager());
    config.DependencyResolver = new UnityResolver(container);
}

I have seen many people register with Unity at 2 layers, i.e., at the Business layer and at the API layer, writing the code given below.

API Layer

 

 
container.RegisterType<IUserBusiness, UserBusiness>(new HierarchicalLifetimeManager());

Business Layer

 

 
container.RegisterType<IUserRepository, UserRepository>(new HierarchicalLifetimeManager());
container.RegisterType<IBaseRepository<User>, BaseRepository<User>>(new HierarchicalLifetimeManager());

I do not endorse that approach, for multiple reasons. Let me compare the pros and cons of the 2 approaches.

Approach 1

API Layer

 

 
container.RegisterType<IUserBusiness, UserBusiness>(new HierarchicalLifetimeManager());
container.RegisterType<IUserRepository, UserRepository>(new HierarchicalLifetimeManager());
container.RegisterType<IBaseRepository<User>, BaseRepository<User>>(new HierarchicalLifetimeManager());

Approach 2

API Layer

 

 
container.RegisterType<IUserBusiness, UserBusiness>(new HierarchicalLifetimeManager());

Business Layer

 

 
container.RegisterType<IUserRepository, UserRepository>(new HierarchicalLifetimeManager());
container.RegisterType<IBaseRepository<User>, BaseRepository<User>>(new HierarchicalLifetimeManager());

The screenshot given below explains why the first approach is the better one.

Injecting Dependencies without any third-party framework

It is not necessary to use a third-party framework for injecting dependencies; we can inject dependencies without Unity or any other third-party container.

Go to the Startup.cs class (in an ASP.NET Core project) and, inside the method ConfigureServices(IServiceCollection services), add the 2 lines of code given below.

 
services.AddTransient<IUserManager, UserManager>();
services.AddTransient<IUserRepository, UserRepository>();

You can also configure those dependencies from the web.config file or from the appsettings.json file.
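What those AddTransient registrations buy you is plain constructor injection: the framework constructs the dependency chain for you whenever a controller is requested. Stripped of the container, the wiring amounts to the hypothetical sketch below (UsersController and IUserManager are illustrative names; the construction in Run() is what the container performs automatically):

```csharp
public interface IUserManager { string Ping(); }

public class UserManager : IUserManager
{
    public string Ping() { return "pong"; }
}

// Stands in for an API controller: it only knows the IUserManager abstraction.
public class UsersController
{
    private readonly IUserManager _manager;
    public UsersController(IUserManager manager) { _manager = manager; }
    public string Get() { return _manager.Ping(); }
}

public static class Wiring
{
    public static string Run()
    {
        // services.AddTransient<IUserManager, UserManager>() tells the framework
        // to perform exactly this construction for each resolution.
        var controller = new UsersController(new UserManager());
        return controller.Get();
    }
}
```

Transient lifetime means a fresh UserManager per resolution; scoped or singleton lifetimes would reuse the instance per request or per application, respectively.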

LINK: http://www.c-sharpcorner.com/article/web-api-architecture-and-dependency-injection-best-practices/

Message Broker Pattern using C#

This article outlines a C# implementation of the popular message broker pattern, generally used for problems that involve brokering messages of arbitrary type.

Introduction

This article covers in detail a C# implementation of the message broker pattern, typically found as a solution to message brokering / routing in enterprise software products. A detailed description of the pattern is available at the following URLs:

Background

In many enterprise software products, messages must be routed across the components of the product, with the conditions that:

  • the routing of messages should not be type-aware, meaning the component actually routing the messages should not care about the type of message
  • both the message publisher and the message receiver in the route channel should be decoupled, meaning the publisher doesn't need to know who the message subscribers are; similarly, the subscribers need not know the originator of the message
  • a publisher can also be a subscriber for the types of message it intends to receive.

The message broker / exchange is illustrated in the diagram above, wherein an arrow from a component towards a message (A, B, etc.) represents publishing, and an arrow from a message to a component represents subscribing. Further, the publishers are completely decoupled from the publishing / consuming mechanism as well as from the actual consumers.

Please note that the message broker pattern described in this article is a solution within a single process context; it does not describe brokering / routing of messages across distributed systems. For systems at that scale, we already have enterprise message brokers, such as Kafka, Azure Service Bus queues, etc.

Using the Code

Consider the following core interface, which defines the contract for a message broker. As outlined there, the method Publish<T>() is a generic publisher for any payload of type T; typically, the originator calls this method to publish a message of type T. The method Subscribe<T>() is called by a client to subscribe to messages of type T. Note that the subscriber hooks up a handler action to receive the message payload and act on it accordingly.

namespace MessageBroker
{
    using System;
    public interface IMessageBroker : IDisposable
    {
        void Publish<T>(object source, T message);
        void Subscribe<T>(Action<MessagePayload<T>> subscription);
        void Unsubscribe<T>(Action<MessagePayload<T>> subscription);
    }
}

The type MessagePayload<T> is a generic type carrying the original message T. The properties Who, What and When describe the source, content and time of publishing, respectively. The class is outlined below:

namespace MessageBroker
{
    using System;
    public class MessagePayload<T>
    {
        public object Who { get; private set; }
        public T What { get; private set; }
        public DateTime When { get; private set; }
        public MessagePayload(T payload, object source)
        {
            Who = source; What = payload; When = DateTime.UtcNow;
        }
    }
}

The implementation of the above interface is outlined in the code below, where the broker is implemented as a singleton. Note that the broker needs to be a singleton to ensure that all messages are routed through a single instance. (As written, the lazy initialization and the subscriber dictionary are not thread-safe; a production implementation would guard them with a lock, or use Lazy<T> and concurrent collections.)

namespace MessageBroker
{
    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Threading.Tasks;
    public class MessageBrokerImpl : IMessageBroker
    {
        private static MessageBrokerImpl _instance;
        private readonly Dictionary<Type, List<Delegate>> _subscribers;
        public static MessageBrokerImpl Instance
        {
            get
            {
                if (_instance == null)
                    _instance = new MessageBrokerImpl();
                return _instance;
            }
        }

        private MessageBrokerImpl()
        {
            _subscribers = new Dictionary<Type, List<Delegate>>();
        }

        public void Publish<T>(object source, T message)
        {
            if (message == null || source == null)
                return;
            if(!_subscribers.ContainsKey(typeof(T)))
            {
                return;
            }
            var delegates = _subscribers[typeof(T)];
            if (delegates == null || delegates.Count == 0) return;
            var payload = new MessagePayload<T>(message, source);
            foreach(var handler in delegates.Select
            (item => item as Action<MessagePayload<T>>))
            {
                Task.Factory.StartNew(() => handler?.Invoke(payload));
            }
        }

        public void Subscribe<T>(Action<MessagePayload<T>> subscription)
        {
            var delegates = _subscribers.ContainsKey(typeof(T)) ? 
                            _subscribers[typeof(T)] : new List<Delegate>();
            if(!delegates.Contains(subscription))
            {
                delegates.Add(subscription);
            }
            _subscribers[typeof(T)] = delegates;
        }

        public void Unsubscribe<T>(Action<MessagePayload<T>> subscription)
        {
            if (!_subscribers.ContainsKey(typeof(T))) return;
            var delegates = _subscribers[typeof(T)];
            if (delegates.Contains(subscription))
                delegates.Remove(subscription);
            if (delegates.Count == 0)
                _subscribers.Remove(typeof(T));
        }

        public void Dispose()
        {
            _subscribers?.Clear();
        }
    }
}

The implementation of the message broker interface maintains a centralized dictionary mapping each message type to its list of subscribers. Each Subscribe<T>() call populates this dictionary with type T as the key, whereas a call to Unsubscribe<T>() ensures that either the key is removed from the dictionary or the subscribing action method is removed from the list of subscribers for type T.
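To see the pattern end to end, here is a hypothetical usage sketch. The TinyBroker below is a condensed version of the article's implementation with one deliberate change, flagged in the comments: it dispatches synchronously rather than via Task.Factory.StartNew(), so that the demo is deterministic. OrderPlaced is an illustrative message type.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class MessagePayload<T>
{
    public object Who { get; private set; }
    public T What { get; private set; }
    public DateTime When { get; private set; }
    public MessagePayload(T payload, object source)
    {
        Who = source; What = payload; When = DateTime.UtcNow;
    }
}

// Condensed broker: same dictionary-of-delegates scheme as the article,
// but dispatching synchronously (the article uses Task.Factory.StartNew).
public class TinyBroker
{
    private readonly Dictionary<Type, List<Delegate>> _subscribers =
        new Dictionary<Type, List<Delegate>>();

    public void Subscribe<T>(Action<MessagePayload<T>> subscription)
    {
        if (!_subscribers.ContainsKey(typeof(T)))
            _subscribers[typeof(T)] = new List<Delegate>();
        _subscribers[typeof(T)].Add(subscription);
    }

    public void Publish<T>(object source, T message)
    {
        if (!_subscribers.ContainsKey(typeof(T))) return;
        var payload = new MessagePayload<T>(message, source);
        foreach (var handler in _subscribers[typeof(T)].OfType<Action<MessagePayload<T>>>())
            handler(payload);
    }
}

// Hypothetical message type.
public class OrderPlaced { public string OrderId { get; set; } }

public static class BrokerDemo
{
    public static string Run()
    {
        var broker = new TinyBroker();
        string seen = null;
        // The subscriber and publisher never reference each other directly.
        broker.Subscribe<OrderPlaced>(p => seen = p.What.OrderId);
        broker.Publish("checkout", new OrderPlaced { OrderId = "A-1" });
        return seen;
    }
}
```

The dictionary key is the message type itself, which is what keeps the routing component free of any knowledge about the payloads it carries.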

Points of Interest

The message payload class, as outlined above, exposes just three properties, but in an enterprise solution the same payload can carry any additional message attributes required. Further, the generic type T here represents a concrete class, but it could equally be a root class acting as a facade over different concrete message types.

LINK: https://www.codeproject.com/Tips/1169118/Message-Broker-Pattern-using-Csharp