I like books. I like owning them. I like reading them. And re-reading them. They're joys to have. Books are bound to be immortal treasures. Digitizing them is cool and nice and all that. But nothing beats the pleasure of experiencing books in print, hardbound, paperback and all.
Biometric authentication is not mainstream. It's expensive. It's hardware-specific. It is also not an all-in-one solution. However, if done right, there is a good chance that the future will be better with it.
Mendz.Graph's Graph class implements IFormattable. The default DOTFormatProvider and DOTFormatter implementations included in Mendz.Graph are really just samples of what can be done. Developers are encouraged to create and use their own IFormatProvider and ICustomFormatter implementations for the Graph.
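To give an idea of the pattern, here's a minimal, hypothetical sketch of an IFormatProvider/ICustomFormatter pair. The class name and the "SUMMARY" format specifier below are made up for illustration; they are not part of Mendz.Graph.

```csharp
using System;

// Hypothetical example of a custom format provider and formatter pair.
// GraphSummaryFormatProvider and the "SUMMARY" specifier are invented for illustration.
public class GraphSummaryFormatProvider : IFormatProvider, ICustomFormatter
{
    public object GetFormat(Type formatType) =>
        formatType == typeof(ICustomFormatter) ? this : null;

    public string Format(string format, object arg, IFormatProvider formatProvider)
    {
        // Apply the custom format only when "SUMMARY" is requested;
        // otherwise fall back to the argument's own formatting.
        if (format == "SUMMARY" && arg != null)
            return $"[{arg.GetType().Name}: {arg}]";
        return arg is IFormattable formattable
            ? formattable.ToString(format, formatProvider)
            : arg?.ToString() ?? string.Empty;
    }
}

// Possible usage, assuming graph implements IFormattable:
// string text = graph.ToString("SUMMARY", new GraphSummaryFormatProvider());
```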
Not as much fanfare as I expected, but it's finally out! Dapper 1.50.4 for .Net Standard 2.0 is ready! Exciting times ahead for fans of the micro-ORM library.
All Mendz projects and products are now .Net Standard 2.0 class libraries. Use them in your .Net Core, .Net Framework, or .Net Standard-targeting projects.
For the longest time, web applications have absorbed processing loads on the server side, leaving the client side to mostly just render the response (i.e. HTML/CSS) from the web server. Now, with JavaScript frameworks like Angular, React, Vue, etc., the client side is sharing the processing load.
When creating ASP.Net applications for business and enterprise requirements, you are most likely doing something nowhere near basic. While simple MVC examples are available everywhere, the truth is, they are more often than not, if not always, NOT applicable as-is.
To those who own game consoles, you know very well why they should last a long time. Unlike smartphones, tablets, laptops and desktops, which have 3 to 5 year life cycles, game console life cycles are estimated at 6 to 8 years. In my opinion, the Xbox One X just about got it right.
Imagine a future where you can have a single device from which you can choose whichever network carrier/provider is providing the best service and offerings where you are. Imagine carrier competition on steroids. Possible? Perhaps with eSIM, and with prepaid subscriptions the norm.
Apple SIM has gone a long way from just enabling data to now also making and receiving calls, as exemplified by the new Apple Watch Series 3. As I See Tech thinks this is big, and you should too!
Apple released the 3rd-generation of the Apple Watch in September 2017. It's available with GPS only or with GPS+cellular via eSIM. This article is interested in the latter, and As I See Tech thinks it's going to be big!
I am pretty much ending the year with a blast! I have finally published my projects on GitHub and NuGet. I am happy with how they are progressing and how they are getting maintained. There is more work up ahead, but mostly it's a celebration to know that they are finally out there.
Documentation can be a lot of work, but someone has to do it. An API with no documentation is like a new invention left unexplained. Come to think of it, writing code and tests is easier than explaining them in a language that most people will understand. Nonetheless, documentation needs to be done, and that's the goal for the rest of the year.
G = (V,E). If that makes sense to you, then Mendz.Graph is for you! If it doesn't make sense to you, and you need to work on graphs (Graph Theory), then Mendz.Graph is definitely for you, too!
Matrix, square matrix, dense matrix, sparse matrix and compressed matrix galore! Create matrices, perform mathematical operations with them, and even compress them using CRS (compressed row storage), CCS (compressed column storage) or CVS (compressed value storage). Mendz.Matrix is a simple and lightweight library for working with matrices.
Mendz.Data is designed with the team in mind. Different developers have different styles. A good team respects each member developer's style. However, a team working on the same project should also mean that all developers agree on many things. At the very least, code completeness, consistency and quality should be predictably maintainable and measurable.
Mendz.Data.* helps teams delegate and separate concerns. Not everything needs to be done in a single project. You can use Mendz.Data to distribute tasks so that deliverables can be created concurrently, tested independently and later assembled/integrated together for functional testing.
The truth is, you can use the Mendz.Data.Repository and Mendz.Data.Repository.Async CRUDS interfaces in your own custom repository implementations. There is nothing in this namespace that requires you to use them only in purely Mendz.Data-based contexts and repositories.
The idea and motivation behind Mendz.Data is to take the abstraction out of project planning, so that you can jump right ahead to coding and development. Mendz.Data is not just a library of APIs. As advertised, it is designed to provide tools and guidance.
More than a month has passed since I published Mendz.Data. Many developers and teams are using Mendz.Data to guide them in developing contexts and repositories for their projects. Simplicity, consistency, productivity, quality and scalability are key motivators.
Advanced ETL requirements can involve building data from various sources. Mendz.ETL-based joiners allow developers to define advanced query operations on data sources, generating data into a desired format that can be better consumed by the ETL flow.
ETL involves agreements between partners regarding how they will "speak" with each other. Validating what a partner sends and what you send to your partner is an important ETL requirement. Mendz.ETL supports the creation of validators, which can be called at the start and at the end of an ETL flow.
The first thing you need to know about Mendz.ETL is that, by itself, it does nothing. After all, Mendz.ETL is just a library of APIs. What do you expect? However, what you create with Mendz.ETL, the adapters, mappers, validators and joiners you invent with it, those will do something. And, together, they can be big.
I am very excited to announce Mendz.ETL, a class library that provides APIs to let you create custom/proprietary extract, transform and load (ETL) solutions.
Mendz.Library v1.2.1 introduces MapperExtensions for the IGenericMapper and IStreamingMapper interfaces. IGenericMapper and IStreamingMapper can help in creating consistent data-mapping code and strategies. MapperExtensions adds a couple of features that aim for the same.
Mendz.Library is my class library project for anything that can be shared and used by various projects. Version 1.2.1 adds new classes, types and extensions that can help spark new ideas and inspirations. Here are the highlights.
Welcome to the new As I See Tech. Well, actually, I've been using this Contempo theme for weeks now. I originally only wanted to test it out for a bit. The way things are, I think that it's working pretty well.
In the Mendz.Data.Repository(.Async) CRUDS interfaces, one of the method parameters is "dynamic expansion = null". You may be wondering: what is it? What's it for? Or perhaps even asking why? Well, it's really nothing... until you need to use it. And it can be one of the most powerful features of Mendz.Data.Repository(.Async)'s CRUDS interfaces.
Mendz.Data.ResultInfo is like a POCO of result information. When used with ADO.Net compatible contexts (like those based on Mendz.Data.Common.DbDataContextBase), ResultInfo can be handy in saving input and output values for the caller. Mendz.Data.Common.ResultInfoExtensions provide a couple of useful shortcuts.
There are always cleaner and simpler ways of doing things. This topic is a matter of preference. Consider this approach to clean out some lines of code from your ASP.Net application's controllers.
Class libraries are meant to be shared. They can be used by web applications, by services and by batch programs. With that in mind, it is important to think about contexts and repositories as class libraries that are meant to be reused and shared by your current application project and by your future application projects.
ADO.Net provides the basic data access capabilities of .Net. Even with the Entity Framework around, ADO.Net remains solid. Just like how the very basic concept of the typewriter remains alive in our keyboards, ADO.Net is here to stay. Mendz.Data.Common embraces this assumption as fact, in order to provide the APIs needed for creating contexts and repositories.
The Mendz.Data.Common namespace provides the APIs needed to create Mendz.Data-aware contexts and repositories. Here's to understanding the main classes and types in this namespace.
Mendz.Data.PagingInfo represents paging information. The idea is basically to have a POCO that can be used to store and exchange paging data that can be used by backend queries, by middleware and by frontend rendering code.
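To make the idea concrete, here's a minimal sketch of what such a paging POCO can look like. The property names below are illustrative assumptions, not PagingInfo's actual members.

```csharp
using System;

// A minimal sketch of a paging POCO. Property names are assumptions for illustration;
// they are not Mendz.Data.PagingInfo's actual members.
public class PagingInfoSketch
{
    public int Page { get; set; } = 1;          // current page (1-based)
    public int PageSize { get; set; } = 20;     // rows per page
    public long TotalCount { get; set; }        // total rows available

    // Derived values the backend query, middleware and frontend can all share.
    public int TotalPages => (int)Math.Ceiling((double)TotalCount / PageSize);
    public int Skip => (Page - 1) * PageSize;   // e.g. OFFSET in SQL, Skip() in LINQ
}
```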
The rationale behind Mendz.Data's DataSettings and DataSettingOptions is simple: provide an easy way to get data settings, like connection strings, anywhere during the application's runtime. Note that .Net Core 2.0's configuration and dependency injection features make this possible. DataSettings and DataSettingOptions provide an alternative.
ResultInfo represents result information about a procedure call. It can store the input and output values. This information can be used, for example, to build messages after an operation or a series of operations completes. These messages can be used for logging, or for alerts and notifications.
Mendz.Data.Repository defines the CRUDS interfaces. Mendz.Data.Repository.Async defines the async versions of the CRUDS interfaces. The new CRUDS signatures add more features and flexibility. Here are the highlights.
If you followed my Dapper series back in May/June this year, you should know that Mendz.Data 1.1.0, as published on GitHub and NuGet, has major breaking changes. For starters, the new Mendz.Data is a .Net Standard 2.0 class library. Internally, there's more.
I originally thought I needed Mendz.Data.EntityFramework. After some minor re-design and simplification, I realized that all I really needed was just Mendz.Data. Immediately, Mendz.Data.EntityFramework became obsolete.
Mendz.Data.SqlServer, which provides a generic Mendz.Data-aware context for ADO.Net-compatible access to SQL Server databases, is now on GitHub and NuGet!
In Mendz.Library, one of the classes defined is the SingletonBase. The SingletonBase is an abstract base class for defining singletons. The singleton instantiation is designed to be thread-safe. Let's look at how it's done.
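As a taste of the general technique, here's a minimal thread-safe singleton sketch using Lazy&lt;T&gt;. It illustrates the idea only; it is not SingletonBase's actual code.

```csharp
using System;

// A minimal sketch of a thread-safe singleton using Lazy<T>.
// Illustrative only; not SingletonBase's actual implementation.
public sealed class ExampleSingleton
{
    // Lazy<T> guarantees the factory runs exactly once, even under concurrent access.
    private static readonly Lazy<ExampleSingleton> _instance =
        new Lazy<ExampleSingleton>(() => new ExampleSingleton());

    public static ExampleSingleton Instance => _instance.Value;

    private ExampleSingleton() { } // private constructor prevents external instantiation
}
```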
In Mendz.Library, one of the classes defined is the IDGenerator. The IDGenerator is a thread-safe generator of incremental integer values. It can be seeded to start from a given number and increment from there. Simple, right? Let's look at how it's done.
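To illustrate the general idea, here's a minimal sketch of a seeded, thread-safe incremental generator. It is not the actual IDGenerator code.

```csharp
using System.Threading;

// A minimal sketch of a seeded, thread-safe incremental ID generator.
// Illustrative only; not the actual Mendz.Library.IDGenerator code.
public class IncrementalIdGenerator
{
    private int _current;

    public IncrementalIdGenerator(int seed = 0) => _current = seed;

    // Interlocked.Increment makes the increment atomic across threads.
    public int Next() => Interlocked.Increment(ref _current);
}

// Usage: var gen = new IncrementalIdGenerator(100); int id = gen.Next(); // 101
```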
Publishing to NuGet is quick and easy. Visual Studio 2017 has the basic tools available to create NuGet packages. Just follow the steps, upload and you're good!
Publishing source code to GitHub is not that difficult. It took me a while to get the gist of it. In the end, the real saving grace was the decision to NOT use the Visual Studio Team Services tools and to use GitHub Desktop instead.
When I ask the question "Is Microsoft Fluent Design System the Next Aero?", I mean it in a very daunting way. What I really mean to ask is whether Microsoft Fluent Design System will suffer the same fate as Windows Aero. Crass, I know. But possible? Yes.
Windows has changed its UI/UX many times over the years. The most successful of these is Windows 7's, which many seem to associate with what a desktop experience should be. Windows 10's MDL/2 is the current "modern" incarnation. A change is likely coming next year, and Microsoft calls it Fluent Design System.
Microsoft's Metro Design Language (MDL) was beautiful. Many of its parts are still in Windows 10 today. However, some of the things that made it great have already disappeared. Let's try to analyze them and the reasons why.
When Microsoft released Windows Phone 7, it introduced a revolutionary user experience that was far ahead of its time. Well, "far ahead of its time" meant that the target users were not ready for it. Regardless, Metro UI, as it was called, was beautiful!
Mendz projects are kept in an online Visual Studio Team Services account as a private and personal collection. Switching to GitHub and publishing to NuGet is now an option.
SOAP web services are not first class citizens in .Net Core. Not so long ago, Microsoft eagerly poured a lot of resources into SOAP-based web services, "sold" it as HTTP+XML, and even published a lot of libraries, software factories, guidelines and what have you for it. Now... oh, how times have changed!
Microsoft developed a shim technology to allow .Net Framework compatibility with .Net Standard and .Net Core. Before .Net Standard 2.0, the shim was incomplete. If you have .Net Framework 4.x-based class libraries that you want to share with .Net Core 1.x projects, you can create a .Net Standard 1.x class library project to enable cross-use.
I am currently starting to port my class library projects from .Net Core 1.1 to .Net Standard 2.0. So far, I am happy to report, the overall migration experience is a breeze!
In application development, adding security features can include requirements for authentication and authorization. Authentication identifies the user. Authorization checks and filters what the user can do. Authentication is mostly easy. Authorization though can be a little bit tricky.
In a world where everything connects to a cloud hosted OS, IoT can be cheaper and simpler. The real focus is fast and reliable networking, because everything really happens in the cloud.
In a world where everything connects to a cloud hosted OS, the smartphone can be cheaper and simpler. There is no need to make them more powerful than an interactive TV. The real focus is on fast and reliable networking, because everything else really happens in the cloud.
If there's anything the iPhone X proves, it's the fact that mobile tech innovation has gone stagnant. It's almost like how our cars are mostly still running on fossil fuel, or how they are still rolling on four air-inflated rubber tires. The mobile industry has taken its cue from the automobile industry. Just advertise it as new, even if there's really nothing new, and it will still sell as new.
With the release of .Net Core 2.0 comes great responsibility. It is time to review Mendz's .Net Core 1.1-based projects, to see how well they can all be upgraded to the latest version of .Net Core.
Learning how to use regular expressions can give developers power over data when solving problems with pattern matching, searching, format validations and conversion.
In programming, if there's a "language" that's as crazy as speaking in cryptic sequences of symbols, it's regular expressions. Learning how to use regular expressions can be intimidating. However, once you get over the dyslexic-looking syntax, you'll be amazed at how powerful they can be in data formatting, matching, processing, searching and validation programs.
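As a small taste, here's what validating and reformatting a date string can look like with .Net's Regex. The pattern and input are just examples.

```csharp
using System.Text.RegularExpressions;

// A small taste of regular expressions in C#:
// validate a format, then extract and rearrange its parts.
class RegexDemo
{
    static void Main()
    {
        string input = "2017-12-31";

        // Validate: four digits, dash, two digits, dash, two digits.
        bool isDate = Regex.IsMatch(input, @"^\d{4}-\d{2}-\d{2}$");

        // Convert: capture groups rearrange yyyy-MM-dd into MM/dd/yyyy.
        string reformatted = Regex.Replace(input, @"^(\d{4})-(\d{2})-(\d{2})$", "$2/$3/$1");

        System.Console.WriteLine($"{isDate}: {reformatted}"); // True: 12/31/2017
    }
}
```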
Although SQL has been the language of choice for RDBMS, it doesn't mean SQL cannot be applicable to NoSQL databases. It's really all about sets. If a collection can be treated as a set, and a collection of collections as a set of sets, SQL should be an easy fit.
The people who promote MongoDB are smart. They don't push MongoDB at you. They help you make intelligent decisions. After all, they know MongoDB is new. They don't just recommend migrating your 10-plus-year-old RDBMS-based applications to MongoDB. More likely, they would instead recommend that you consider creating new requirements and applications in MongoDB.
Data collection is highly valued. There are businesses focused on profiting from it. Data analyst and data scientist are the new careers to aim for. It's not just about big data anymore. It's now about really putting data to work, with the ultimate manifestation being what is known as artificial intelligence.
I am a Windows 10 phone user. I own a Nokia Lumia ICON. I love it. With Microsoft seemingly killing off Windows 10 for phones anytime soon, I feel the pain. I probably share the same sentiments as other Windows 10 phone users. Regardless of how loyal to the platform we wish to be, it seems inevitable that we shall forever be the last few. Microsoft does not care.
While researching compressed sparse matrices, I stumbled upon compressed row storage (CRS; compressed sparse row, CSR; Yale format) and compressed column storage (CCS; compressed sparse column, CSC). These sparse matrix compression formats are popular. Stepping back and imagining the possibilities, I considered applying basic lossless compression techniques to sparse matrices, exploiting data redundancy, which led to what I call the compressed value storage (CVS).
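To make the idea concrete, here's a rough sketch of the CVS concept as described in this series: each distinct non-zero value is stored once, paired with the linear indexes where it occurs. This is illustrative only, not the actual Mendz implementation.

```csharp
using System;
using System.Collections.Generic;

// A rough sketch of the CVS idea: keep each distinct non-zero value once,
// paired with the list of linear indexes where it occurs.
// Illustrative only; not the actual Mendz implementation.
class CvsSketch
{
    static void Main()
    {
        int[,] matrix =
        {
            { 5, 0, 5 },
            { 0, 7, 0 },
            { 5, 0, 7 }
        };
        int columns = matrix.GetLength(1);

        // value -> linear indexes (row * columns + column) where that value appears
        var cvs = new Dictionary<int, List<int>>();
        for (int r = 0; r < matrix.GetLength(0); r++)
            for (int c = 0; c < columns; c++)
                if (matrix[r, c] != 0)
                {
                    if (!cvs.TryGetValue(matrix[r, c], out var indexes))
                        cvs[matrix[r, c]] = indexes = new List<int>();
                    indexes.Add(r * columns + c);
                }

        foreach (var entry in cvs)
            Console.WriteLine($"{entry.Key}: [{string.Join(", ", entry.Value)}]");
        // 5: [0, 2, 6]
        // 7: [4, 8]
    }
}
```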
Compressed value storage (CVS) applies lossless compression to a matrix, resulting in storage that can be smaller than popular compressed matrix formats like CRS and CCS. In Visualizing Compressed Value Storage (CVS), I described an imagination of using CVS to store and render images. The overall idea is simple. In fact, it's so simple that it seems possible to implement the idea using JSON and the HTML5 canvas.
This article describes an imagination of compressed value storage (CVS) being used in graphics storage and rendering. The imagination does not make any assumption that CVS can really be used for digital images. This is basically a spill of thought processes, and no code or implementations are shared. Still interested? Read on...
CVS is coordinate-wise in many ways. However, some operations are challenging. Performing CVS-to-CVS multiplication is possible but with a catch (at least as of this writing). CVS's own rule about keeping distinct non-zero values and aligning their lists of linear indexes creates a challenge when performing operations that can create zeroes or repeating values.
CVS is coordinate-wise in many ways. However, some operations are challenging. Performing CVS-to-CVS addition/subtraction is possible but with a catch (at least as of this writing). CVS's own rule about keeping distinct non-zero values and aligning their lists of linear indexes creates a challenge when performing operations that can create zeroes or repeating values.
The compressed value storage (CVS) is new. I say that because I can't find materials about it or something similar to it online. Or perhaps I am searching with the wrong keywords. Basically, my problem is that there is not much to find about how CVS can be used in matrix operations. So, for now, I have to figure things out on my own.
My current study is to add more features/methods for the compressed value storage (CVS) format. CVS is a lossless compression format for sparse matrices. A lot of things are easier with CVS than CRS/CCS. Let's look at MatrixScalarProduct() and MatrixScalarProductInPlace().
My current study is to add more features/methods for the compressed value storage (CVS) format. CVS is a lossless compression format for sparse matrices. A lot of things are easier with CVS than CRS/CCS. Let's look at Transpose(), SetLinearIndexMode() and TransposeToNewCVS().
There is not much to find online about the compressed value storage (CVS). It seems like CRS and CCS are the most popular matrix compression formats. So here I am writing about CVS a little more. This way, you have something to find online about it.
.Net's LINQ lets developers create applications that can be 100% data source/target agnostic. Let's look at how that can work with SQL and NoSQL databases.
Regardless what some programmers may think, SQL is still the best language to express operations with sets. And, yes, I want to stress exactly that. Sets!
Now it's time to get organized. The way I did it, Mendz.Data is a separate project from Mendz.Data.MongoDB. By default, using Mendz.Data makes the project ADO.Net ready. Adding a reference to Mendz.Data.MongoDB makes the project also MongoDB ready.
Mendz.Data.MongoDB provides the types and classes that can allow developers to use MongoDB in their applications following the same concepts, designs and principles applied by Mendz.Data for ADO.Net compatible data access (like Dapper, for example).
Mendz.Data provided classes and types that allow developers to create ADO.Net compatible data contexts and model "repositories". The primary motivation behind the design is for use with Dapper, a micro-ORM library. If you've read enough about MongoDB, you should know that it is not ADO.Net compatible. So, what is the Mendz.Data.MongoDB namespace all about then?
If you can compress them, you can decompress them, right? Decompression helps to validate the compression. In this article, I'll explore how CVS, CRS and CCS can be decompressed to a two-dimensional array T[,] dense matrix.
The CVSExtensions, CRSExtensions and CCSExtensions serve as my placeholders for compressed matrix operations. This article focuses on matrix-vector multiplication.
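For reference, here's a sketch of the standard CRS matrix-vector product. It illustrates the textbook algorithm; it is not the actual CRSExtensions code.

```csharp
// A sketch of the standard CRS (compressed row storage) matrix-vector product.
// Illustrative only; not the actual CRSExtensions code.
public static class CrsSketch
{
    // values:      non-zero entries, stored row by row
    // columnIndex: column of each entry in values
    // rowPointer:  index in values where each row starts (length = rows + 1)
    public static double[] MultiplyByVector(
        double[] values, int[] columnIndex, int[] rowPointer, double[] vector)
    {
        int rows = rowPointer.Length - 1;
        var result = new double[rows];
        for (int row = 0; row < rows; row++)
            for (int i = rowPointer[row]; i < rowPointer[row + 1]; i++)
                result[row] += values[i] * vector[columnIndex[i]];
        return result;
    }
}
```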
In .Net, extension methods are fantastic ways of adding new features to types that can share the same behavior (static methods) without necessarily changing the types' source code. Extension methods can be maintained in separate code files. IDOKSparseMatrixExtensions provides extension methods for sparse matrices that implement IDOKSparseMatrix.
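As a quick refresher on the mechanics, here's a tiny extension method example. The method below is made up for illustration; it is not part of IDOKSparseMatrixExtensions.

```csharp
using System.Collections.Generic;

// A minimal example of an extension method: it adds behavior to a type
// without touching the type's source. GetOrDefault is invented for illustration.
public static class DictionaryExtensions
{
    // "this" on the first parameter is what makes it an extension method.
    public static TValue GetOrDefault<TKey, TValue>(
        this IDictionary<TKey, TValue> source, TKey key, TValue defaultValue = default(TValue))
    {
        return source.TryGetValue(key, out var value) ? value : defaultValue;
    }
}

// Hypothetical usage on a DOK-style dictionary keyed by coordinates:
// var weight = sparseEntries.GetOrDefault((row, column), 0.0);
```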
When I discussed compressed matrices, I mentioned two common compressed matrix formats, CRS and CCS, and described a third one I called CVS. These compressed matrices are supported in Mendz.Library.Matrices.
Mendz.Library.Matrices abstracted the sparse matrix via IDOKSparseMatrix and DOKSparseMatrixBase. Now it's time to create the classes DOKSparseMatrix and DOKConcurrentSparseMatrix. From these, the coordinates-keyed and linear-index-keyed concrete classes will be created. If you've been waiting for it, note that Mendz.Graphs..Sparse matrices use CoordinatesKeyedConcurrentSparseMatrix, which is described in this article.
After much self-deliberation, I decided that the sparse matrix must be abstracted. The primary inspiration of the sparse matrix's design in Mendz.Graphs came from Wolfram where the SparseArray is shown as a collection of coordinates and their values ((0, 0) -> 1, (3, 1) -> 3, (7, 5) -> 5, ...).
I re-organized Mendz.Library and created a separate Mendz.Library.Matrices project for the growing list of matrix-specific types and classes that started popping up while working on matrix compressions. This enhancement is big!
Once upon a time, there was a problem that inspired a quest. That quest created Mendz.Graphs, which is now a .Net Core class library that grew from a simple implementation of G = (V, E) into one that also provides features to represent a graph as a list, a dense matrix, a sparse matrix and, quite recently, a compressed matrix.
The connection matrix is a (true, false, false)-adjacency matrix, implemented in the Mendz.Graphs.Representation.Matrices namespace as the ConnectionMatrix.
In Mendz.Graphs, you should find that the AdjacencyMatrix, WeightedAdjacencyMatrix and GenericAdjacencyMatrix classes are coded pretty much the same way. In fact, they are essentially coded exactly alike, with only the namespaces different. Why is this so, Mendz?
Looking at Mendz.Graphs, you can clearly see how repeatable coding patterns are celebrated. Take note, I didn't say repeating code. I said repeatable coding patterns. Maintenance-wise, it makes a lot of difference.
Mendz.Graphs provides a set of concrete sparse matrix implementations. Compared to dense matrices, sparse matrices save on memory by using only what's needed to represent the graph.
The incidence matrix is a rectangular matrix whose entries can vary depending on whether the graph is directed or undirected, and whether the incidence matrix is oriented or not. The sparse version applies the same logic used by the dense version.
Come to think of it, the degree matrix and its variations are really sparse. Implementing the degree matrix as a sparse matrix just makes absolute sense!
In Mendz.Graphs, representing graphs as matrices follows similar coding patterns. This is basically because they all derive from the GraphMatrixBase class. Thus, it shouldn't be surprising to see that the code for sparse matrices looks a lot like that of their dense matrix counterparts.
In the previous series, I covered the dense matrices in Mendz.Graphs, which implement them as two-dimensional arrays T[,]. They work well, really... that is, until you start loading graphs with a very large collection of vertices and edges.
An incidence matrix shows the relationship between a vertex and an edge. An incidence matrix C is a (1, 0)-matrix, where C(v,e) is 1 if v and e are linked, otherwise 0. There are variations though when the graph is directed or undirected, and when the incidence matrix is oriented or not (Wikipedia; Wolfram).
The degree matrix is defined as a diagonal matrix which contains information about the degree of each vertex — that is, the number of edges attached to each vertex (Wikipedia). The degree matrix can be easily defined using Mendz.Graphs..DenseGraphMatrixBase.
Mendz.Graphs..Dense.AdjacencyMatrixBase makes it very easy to define (a, b, c)-adjacency matrices. So much so that I just had to create a GenericAdjacencyMatrix!
Given Mendz.Graphs..Dense.AdjacencyMatrixBase, it's not that difficult to create other types of adjacency matrices. Let's look at two of them: the Seidel adjacency matrix and the Laplacian matrix.
In graph theory, the adjacency matrix is defined as a square matrix used to represent a finite graph, where the elements of the matrix indicate whether pairs of vertices are adjacent or not in the graph (Wikipedia).
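A tiny illustration of the definition, with a made-up edge list:

```csharp
// A small illustration of the definition: a square matrix where entry (i, j)
// is 1 when vertices i and j are adjacent, and 0 otherwise.
// The vertex count and edge list are made up for illustration.
class AdjacencyMatrixDemo
{
    static void Main()
    {
        var edges = new[] { (0, 1), (1, 2), (2, 0) };   // a small undirected triangle
        var adjacency = new int[3, 3];

        foreach (var (a, b) in edges)
        {
            adjacency[a, b] = 1;
            adjacency[b, a] = 1;   // undirected: mark both directions
        }

        // adjacency is now:
        // 0 1 1
        // 1 0 1
        // 1 1 0
    }
}
```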
What is the matrix? No, this article is not about the movie. This article actually kicks off a new series on graph theory, graphs and their representations. So far, I've covered representing graphs as an adjacency list. Now, it's time to represent graphs as matrices.
In graph theory, the adjacency list is defined as a collection of unordered lists used to represent a finite graph, where each list describes the set of neighbors of a vertex in the graph (Wikipedia). Mendz.Graphs.AdjacencyList implements an enhanced adjacency list that can also be used to retrieve the incidence.
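A tiny illustration of the concept, using a plain dictionary rather than Mendz.Graphs.AdjacencyList's actual structure:

```csharp
using System.Collections.Generic;

// A small illustration of an adjacency list: each vertex maps to the list of its neighbors.
// The shape here is a generic dictionary, not Mendz.Graphs.AdjacencyList's actual structure.
class AdjacencyListDemo
{
    static void Main()
    {
        var edges = new[] { (1, 2), (1, 3), (2, 3) };
        var adjacencyList = new Dictionary<int, List<int>>();

        foreach (var (from, to) in edges)
        {
            if (!adjacencyList.TryGetValue(from, out var neighbors))
                adjacencyList[from] = neighbors = new List<int>();
            neighbors.Add(to);
        }

        // adjacencyList[1] -> { 2, 3 }, adjacencyList[2] -> { 3 }
    }
}
```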
Mendz.Graphs implements the graph theory's definition of the graph. The Graph class provides features to maintain the Vertices, Edges and their respective indexes. It can also be used to generate the graph's DOT notation. So does the Graph class represent a graph? Well, yes and... not exactly.
In Working with Dapper (Part 2), I showed Mendz.Library.ResultInfo, a class which allows procedures to return a structured message that is rich with data and information. While testing scenarios using ResultInfo, I found a flaw that needs to be fixed.
A good thing that came out of writing about my own work is that it gave me a chance to rediscover and review what I did. When I first wrote Mendz.Graphs around a year ago or so, I was using .Net Framework 4.x. As of this week, after posting about the Graph yesterday, Mendz.Graphs is officially a .Net Core class library project.
The Graph Theory in C# presented the Graph, Vertex and Edge classes to implement the graph theory's definition of a graph: G = (V, E). While reviewing Mendz.Graphs for memory and performance optimization, it went through dramatic changes and enhancements. Previously, I reviewed the Vertex and the Edge. This article focuses on the Graph.
The Graph Theory in C# presented the Graph, Vertex and Edge classes to implement the graph theory's definition of a graph: G = (V, E). While reviewing Mendz.Graphs for memory and performance optimization, it went through dramatic changes and enhancements. Previously, I reviewed the Vertex. This article focuses on the Edge.
The Graph Theory in C# presented the Graph, Vertex and Edge classes to implement the graph theory's definition of a graph: G = (V, E). While reviewing Mendz.Graphs for memory and performance optimization, it went through dramatic changes and enhancements. This article focuses on the Vertex.
Eat. Sleep. Code. Repeat. This whatchamacallit can make a developer smile or smirk. For those who have experienced it once or so in their career, it can be both funny and, well, not so funny at all. Whatever it means to you, just know that developers live, love and laugh just like everyone else. But I digress. The point at hand really is understanding what it means to "repeat".
If you could give each of your memories a name, it would make each of them more memorable. Giving a name to everything is human nature. It is how we acknowledge and recognize the presence of something or someone. It is how we remember. It is how we communicate. It is how we share.
Logic formulation is fundamental to learning how to code. Although you can learn to code from examples, the ability to solve real world problems can boil down to your ability to apply, build and express logic.
Once in a while, you just want a web page. In ASP.Net's MVC world, that means all the scaffolding and coding for Model, View and Controller. ASP.Net Core 2's Razor Pages simplifies all the plumbing to just getting the page done with the knowledge that it's all MVC underneath.
There's a new product in town that embraced dependency injection to the core: .Net Core (and, therefore, ASP.Net Core). If you like dependency injection (DI), .Net Core gives you DI overload, like sugar sprinkled on honey with a cherry on top!
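As a small taste, here's a minimal registration sketch using ASP.Net Core's built-in container. The service and class names below are made up for illustration.

```csharp
using Microsoft.Extensions.DependencyInjection;

// A minimal taste of .Net Core's built-in DI container.
// IGreetingService and GreetingService are made-up names for illustration.
public interface IGreetingService { string Greet(string name); }

public class GreetingService : IGreetingService
{
    public string Greet(string name) => $"Hello, {name}!";
}

public class Startup
{
    // ASP.Net Core calls this at startup; register services with the lifetime that fits.
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddSingleton<IGreetingService, GreetingService>(); // one instance for the app
        // services.AddScoped<...>()    -> one instance per request
        // services.AddTransient<...>() -> a new instance every time
    }
}

// Anything registered can then be constructor-injected, e.g.:
// public HomeController(IGreetingService greetings) { ... }
```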
Mendz.Data represents the team's "core architecture" for creating applications with a database backend. Mendz.Data (and Mendz.Library) is discussed in the Preparing for Dapper and Working with Dapper series. This article is in response to an inquiry about transactions with repositories based on Mendz.Data.
Dapper inspired me to develop a new "core architecture" for creating database contexts, "repositories" and using them in applications. In my design, the applications are data source agnostic, shielded by the repositories from the intricacies of the data layer. In practice, Dapper is used exclusively in the repository's CRUDS implementations.
In order to guide developers on how to define repositories, I created CRUDS interfaces with expandable signatures and support for structured return values. This article describes how to use the Mendz.Data classes to create a database context, a POCO, a repository and, finally, to use them in an application.
Empowered with a "template" to define the CRUDS interface method signatures, thanks to the ResultInfo class and .Net's dynamic types, I can quickly complete my interface designs.
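To give an idea of the shape, here's a hypothetical sketch of such a CRUDS-style interface, borrowing the ResultInfo return value and the "dynamic expansion = null" parameter mentioned in this series. It is not the actual Mendz.Data.Repository interface.

```csharp
// A hypothetical sketch of a CRUDS-style interface with "expandable" signatures.
// ResultInfo is the structured return type discussed in this series;
// the method names and shapes here are illustrative, not the actual
// Mendz.Data.Repository interfaces.
public interface ICrudsSketch<TModel>
{
    ResultInfo Create(TModel model, dynamic expansion = null);
    ResultInfo Read(object key, dynamic expansion = null);
    ResultInfo Update(TModel model, dynamic expansion = null);
    ResultInfo Delete(object key, dynamic expansion = null);
    ResultInfo Search(object criteria, dynamic expansion = null);
}
```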
One of the problems with trying to define interface method signatures is that it can involve a lot of guesswork and, therefore, a lot of risk. At design time, you can only assume so much. A few months later, it's possible that you'll encounter scenarios that might not be supported. Signatures that are not good enough are an interface designer's nightmare. Years down the line, changing an interface is like declaring war!
In the previous series, I talked about how I was able to put together database context and "repository" classes for what the team calls the new "core architecture". In Part 4 of that series, I mentioned the CRUDS interfaces, but I did not expand on them. This new series covers the CRUDS interfaces themselves, and how they can be used to complete the repositories.
A new "core architecture" has been defined for the team. IDbDataContext and DbRepositoryBase are designed to shield the applications from the data source and from the underlying technologies of the data layer. As far as the applications are concerned, they know only of the POCOs and the repositories.
So far, I have the POCOs and the database context. If I let the application access the database context directly, it binds the application to the database context. This breaks the goal of making sure that the application is data store agnostic. What's needed is a mediator between the database context and the application itself.
The team decided that they would create POCOs to represent data models. It is the first step to making applications data source agnostic. The second step is to actually define the objects that would shield applications from the data source.
It's time to replace a 10-year-old "core architecture" with a modern version. The team decided to continue using stored procedures with the intent of keeping all business rules and logic in one place. Instead of going for Microsoft's Entity Framework, the team opted to use Dapper, a micro-ORM with lots of promise.
When I got myself into writing a new library for data access layers, NOT choosing Microsoft's Entity Framework was an easy decision. Likewise, the micro-ORM Dapper was an easy choice.
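As a small taste of why, here's a minimal Dapper sketch that calls a stored procedure and maps the rows to POCOs. The connection string, procedure name and Customer class are made-up names for illustration.

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Linq;
using Dapper;

// A minimal taste of Dapper: open a connection, call a stored procedure,
// get typed POCOs back. Customer and dbo.GetCustomers are made up for illustration.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class DapperDemo
{
    public static IEnumerable<Customer> GetCustomers(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            // Dapper's Query<T> extension maps each row to a Customer.
            return connection.Query<Customer>(
                "dbo.GetCustomers",
                commandType: CommandType.StoredProcedure).ToList();
        }
    }
}
```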
In a world filled with jargon, software developers never run out of new terms. MVC, ORM, UoW, SPA, SignalR, GitHub, NuGet, DevOps, etc. If you don't catch up, you can get lost figuring out which new one is a concept, a technology, a product or a service.
The repository pattern is one of the most misunderstood patterns in software design and engineering. Although Martin Fowler was clear about what it is and what it's for, the developer community interpreted it as something simpler than what it really is.
The Graph needs to be smart and self-validating. Likewise, it should be lightweight. The finishing touches give the Graph its proper constructor and a method that ultimately ends my quest.
When I started this quest, I found out that the graph in graph theory is complicated. However, focusing on the graph's definition made it look so simple in C#. After working on the Vertex and the Edge classes, I realized that I would need to expand my Graph class to something smarter.
Graph is a class with two main properties: a list of Vertex objects and a list of Edge objects. Vertex is a class with two main properties: an ID and a Value. Edge is a beast. Creating a class of type Edge is an exercise of self-control and avoiding temptations. As redundant as it may sound, the Edge can literally take you to the edge.
The Vertex is a class that has properties ID and Value. Both ID and Value must be set. This posed a problem because, in the real world, not everything has an ID. It should be possible to create a Vertex with both ID and Value available. However, it should also be possible to create a Vertex with just the Value, the Vertex internally assigning it a virtual ID.
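To make the idea concrete, here's a rough sketch of a value-only constructor assigning a virtual ID. It is illustrative only, not the actual Mendz.Graphs.Vertex code.

```csharp
using System.Threading;

// A sketch of the idea: a Vertex can be created with an explicit ID,
// or with just a Value, in which case it assigns itself a "virtual" ID.
// Illustrative only; not the actual Mendz.Graphs.Vertex code.
public class VertexSketch
{
    private static int _virtualId;  // shared counter for auto-assigned IDs

    public int ID { get; }
    public object Value { get; }

    public VertexSketch(int id, object value)
    {
        ID = id;
        Value = value;
    }

    // Value-only constructor: this sketch assigns a negative, decrementing ID
    // so the virtual IDs never collide with typical real-world (positive) IDs.
    public VertexSketch(object value)
        : this(Interlocked.Decrement(ref _virtualId), value)
    {
    }
}
```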
Let's start with the Vertex. Wikipedia defines a vertex as the fundamental unit of a graph. If you read through the article, it expands on how the vertex is used in a graph instead of actually describing what a vertex is and what, by itself, a vertex represents. I needed a definition that's less wordy and more specifically focused on the vertex itself. In some other illustrations, the vertex is described as a junction or a region. Again, these are vague. It seemed like the vertex could not be defined without setting aside how a graph is visualized.
Focusing on the basic definition of the graph guided my quest to a good path. G = (V, E) means that a graph is a collection of vertices and edges. As we can all see now, there are three objects involved here: the graph G itself, the vertices V and the edges E. But what are vertices? And what are edges? In the same way that I wanted my graph to be true to the definition, I needed my vertices and edges to implement the "official" definitions as well.
In my quest to create my own graph theory library in C#, I went from being conventional to being radical. I searched the Internet for how programmers were doing it and got overwhelmed by the varied approaches. So I started reading about graph theory and got just as drowned in TMI (too much information). Finally, I realized that my problem would be easier to solve if I just focused on the first page of my research: the definition of the graph.
Many developers agree that the best way to express the world in code is through object-oriented programming, or OOP. OOP basically represents everything as objects. In fact, modern OOP languages have the "object" as the base or root class of everything, including the traditionally primitive types like char, integer, float, double and Boolean. From the most basic to the most complicated, they are all just objects in OOP.