The current full-stack web development era offers a plethora of front-end and back-end frameworks. Understanding the difference between server-rendered and client-rendered web application development frameworks will help you discern and choose the framework and tooling best suited to your needs. Let's explore this in the context of one of the most popular web development frameworks – ASP.NET Core.
ASP.NET Core is a popular open-source web development framework and the latest evolution of ASP.NET, built on the .NET Core platform. As we know, .NET Core is a popular open-source development framework for building secure, scalable software applications for the web, mobile, IoT, and various other platforms. ASP.NET Core is the specific piece containing the libraries that enable full-stack web development with .NET.
A web application, at its core, allows user or application (simulated) interactions via a front-end/client user interface (UI) for data entry, reporting/analysis, and other kinds of information management. Data is transmitted to the web server for processing at various interaction points, and the results are rendered back to the client UI. Historically, this was fulfilled via the traditional CGI interfaces (if you are as old as me, or cared to study the history). Later along the line, scripting languages such as ASP, JSP, Perl, PHP, Python, and Ruby were (and still are) used. These web development technologies and frameworks have evolved into their own latest versions and flavors, presenting the web developer with various options to fit the application to her/his needs.
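The request/response cycle described above can be sketched in a framework-agnostic way. The minimal example below uses Python's standard `http.server` purely as an illustration (the `render_page` helper is invented for this sketch); in a real ASP.NET Core application, middleware and the Razor view engine play the same roles of processing the submitted data and rendering the result back to the client.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

def render_page(name: str) -> str:
    # Server-side rendering: the HTML is produced on the server
    # before being sent back to the client.
    return f"<html><body><h1>Hello, {name}!</h1></body></html>"

class GreetingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Parse the data submitted by the client (a query string here;
        # a real data-entry app would also handle form POST bodies).
        query = parse_qs(urlparse(self.path).query)
        name = query.get("name", ["world"])[0]
        body = render_page(name).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To serve locally, uncomment the line below and visit
# http://localhost:8000/?name=Ada to see the server-rendered result.
# HTTPServer(("localhost", 8000), GreetingHandler).serve_forever()
```

A client-rendered framework inverts this flow: the server sends a mostly empty page plus script, and the HTML is assembled in the browser from data fetched over the wire.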
Server rendered frameworks
Towards Client rendered frameworks
Blazor model web apps might not be suitable for all your complex use cases and workflows. This means that both server-side and client-side rendered web application development frameworks have their own use cases, pros, and cons. The goal of this article was to highlight the difference in approaches and your options for selecting between the two based on your needs.