ADO.Net Entity Framework Posts by ThinqLinq

ThinqLinq Samples on Github

When I first created this site, I used it both as a test bed for technologies as I played with them and as a repository for others to use for my presentations and other insights. Over the years people have been free to grab these samples, but haven’t had good ways of giving back to help improve the samples.

In the meantime, there have been quite a number of technology changes. While I may have been late to the party on some, there are only so many things one can focus on. One of the advances we’ve seen is the explosion of web-based distributed source control systems. In that light, as an effort both to make it easier for you to access the sample projects and to help you give back, I’ve started publishing my samples on GitHub. While you’ll still be able to download the samples from my downloads page, I won’t be updating them as new versions ship; instead, I plan to keep the samples updated on GitHub.

At this point, the following projects are available via my GitHub repository:

  • RX Samples – Includes Windows Phone, Silverlight, Windows Store App, WebForm, HTML/RxJs, and SignalR samples with code in VB, C# and JavaScript.
  • WebAPI Samples – Includes Sample recipe EF 6 models and Web API endpoints.
  • SignalR Rx Samples – Includes Chat samples in Web and WPF, Reactive Sensors in Web and Console projects, and Drag/Drop with MVC and WPF

As always, please let me know what you Thinq. Does Git make accessing the samples easier? Is there something else I should add to the samples?

Categories: C# - Entity Framework - JavaScript - LINQ - Rx - RxJs - SignalR - VB - WinRT - WP7 -

Left Outer Joins in LINQ with Entity Framework

As I spend more time reviewing code with clients and on public forums, I’m constantly seeing cases where people have issues with outer joins in the various flavors of LINQ. In this post, I’m going to look at a couple of options from a syntax perspective that you can use to make working with outer joins easier in LINQ. Naturally, we have a much deeper discussion of outer joins in our book that you’re welcome to dig into as well.

Typically, if you do a join in LINQ, it will perform an inner join, returning only the records that have matching results on both sides of the evaluation (in both tables). If a child table doesn’t have a record for a parent row, that parent will be excluded from the results. Consider the case in Northwind where an Order record exists but has no order detail lines. The following LINQ query won’t return that order as part of its result set:

from o in Orders
join od in OrderDetails on o.OrderID equals od.OrderID
select new {o, od}

This translates into the following SQL statement (note to DBAs concerned by the * below: EF does list the individual columns in the actual SQL; I’m reducing them to * in this post to focus on the join clauses):

SELECT 
    [Extent1].*, [Extent2].*
FROM  [dbo].[Orders] AS [Extent1]
    INNER JOIN [dbo].[OrderDetails] AS [Extent2] ON [Extent1].[OrderID] = [Extent2].[OrderID]

If you want to include an order regardless of whether it has any detail lines, you need to turn this inner join into an outer join using the DefaultIfEmpty extension method. LINQ only supports left outer joins; if you want a right outer join, you need to flip the logic of your query to turn it into a left outer join. In order to use DefaultIfEmpty, you typically need to push the joined set into a temporary value first and then select from it with the DefaultIfEmpty method:

from o in Orders
join innerOD in OrderDetails on o.OrderID equals innerOD.OrderID into Inners
from od in Inners.DefaultIfEmpty()
select new {o, od}

This generates the expected LEFT outer join as shown below:

SELECT 
    [Extent1].*, [Extent2].*
FROM  [dbo].[Orders] AS [Extent1]
    LEFT OUTER JOIN [dbo].[OrderDetails] AS [Extent2] ON [Extent1].[OrderID] = [Extent2].[OrderID]

The problem I have with this is that the syntax seems overly verbose for such a simple change. As is often the case, Microsoft gives us multiple ways to accomplish the same goal. One approach that I’ve found helpful is to revert to an older ANSI-style syntax (pre SQL-92) where the join is expressed in a where clause instead of a join clause. By mixing and matching LINQ query comprehensions and lambda syntax, we can restate the above query as follows:

from o in Orders
from od in OrderDetails
    .Where(details => o.OrderID == details.OrderID)
    .DefaultIfEmpty()
select new {o, od}

If we check the SQL, we can see that this generates the same SQL as the more verbose version above. When I’m doing this in application code, I’ll often put the Where and DefaultIfEmpty on the same line so that I can focus on what I’m fetching from rather than on how the sources are joined, unless the join itself needs attention.

There is an issue with using this syntax when joining on nullable values or strings. Since the where clause doesn’t know about the cardinality of the relationship (1-many, 0..1-many, etc.), Entity Framework adds an additional check that the nullable value is not null, to keep rows where both sides are null from being treated as a match. Changing our query to use Northwind’s Customers and Orders, which join on the string CustomerID values, we can write the following LINQ query, nearly identical to the one we created before for Orders and OrderDetails:

from c in Customers
from o in Orders
    .Where (o => o.CustomerID == c.CustomerID)
    .DefaultIfEmpty()
select new {c, o};

This results in the following SQL statement.

SELECT 
    1 AS [C1], 
    [Extent1].*, [Extent2].*
    FROM  [dbo].[Customers] AS [Extent1]
    LEFT OUTER JOIN [dbo].[Orders] AS [Extent2] ON ([Extent2].[CustomerID] = [Extent1].[CustomerID])
       AND ([Extent2].[CustomerID] IS NOT NULL)

Note that in this case we have an additional clause checking that Extent2.CustomerID IS NOT NULL. This may seem innocuous, but I have found at least one case where query execution was significantly slower because this clause caused an index scan rather than an index seek. You have to be careful when crafting your queries here and monitor performance even more closely to avoid unnecessary bottlenecks.

While this version works, I still prefer to use associations rather than joins, thinking about the problem from an object graph perspective rather than in terms of set-based operations. As long as there is a natural association between the entities, I much prefer navigating the association to building out the joins manually each time. Using associations, we can restate our join even more simply:

from o in Orders
from od in o.Order_Details.DefaultIfEmpty()
select new {o, od}

Note that if you omit the DefaultIfEmpty clause, you would get an inner join rather than a left outer join.
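
For comparison, the same association-based query without the DefaultIfEmpty call translates to an INNER JOIN:

from o in Orders
from od in o.Order_Details
select new {o, od}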

If you have other ways of creating outer joins in LINQ that you prefer, let me know what you thinq.

Categories: C# - Entity Framework - LINQ -

Custom Logging with Entity Framework EF6

Coming from LINQ to SQL, I’ve long been a fan of its logging simplicity, where you just set the context.Log to a stream (Console.Out) to see the SQL statements being issued to the database. Prior to EF6, the lack of this feature was a rather frustrating omission, one that has finally been rectified in EF6. Now you can log SQL statements just as simply, but with a slightly different implementation. In EF, you set the DbContext’s Database.Log property to an Action delegate that takes a string as its input parameter. Thus, to log requests to the Diagnostics Trace implementation, you simply set the following:

context.Database.Log = Sub(val) Diagnostics.Trace.WriteLine(val)

This is fine as long as you are willing to accept the default logging implementation. If you want to customize the log output, things get a bit trickier. In my example, I only want to log the calling method and elapsed time that the query took to execute. I’m not as concerned in my current need for the SQL string, but that is easily included as I’ll point out below.

To start, I need to capture the calling method in my code so that I can access it in the logging implementation. Since the calling code is actually in ASP.Net Web API methods returning IQueryables, the database query is not executed until long after my application code has completed. I therefore need to explicitly identify the application’s calling method rather than taking the last method from the call stack, which would otherwise be one of the Web API internal methods. To handle this and centralize my set-up logic for the context, I’ll create a factory method which configures the context and use that instead of relying on the context’s constructor. Thanks to partial classes, I can extend the generated entity class to include this new property and factory method. To make matters even easier, I’ll take advantage of the CallerMemberName attribute to automatically pull the member name from the method that is calling the factory.

Public Class NorthwindEfEntities

    Public Shared Function ContextFactory(<CallerMemberName> Optional memberName As String = "") As NorthwindEfEntities
        Dim context = New NorthwindEfEntities()
        context.CallingMethod = memberName
        context.Database.Log = Sub(val) Diagnostics.Trace.WriteLine(val)
        Return context
    End Function

    Public Property CallingMethod As String

End Class

Now, to create the context we call this new ContextFactory method, but don’t pass the memberName explicitly. The compiler will add that for us automatically.

    Public Function GetCustomers() As IQueryable(Of DTO.DtoCustomer)
        cn = NorthwindEfEntities.ContextFactory()
        ' Do some amazing query and return it.
    End Function

Now that we’ve set up the logger, we need to customize the output that is generated. To do this, we need to add a new class that derives from DatabaseLogFormatter (in the System.Data.Entity.Infrastructure.Interception namespace). If you don’t have this namespace, you may need to upgrade to EF6 on order to access the logging functionality. Since the base class doesn’t have a default parameterless constructor, we need to make a new one and simply delegate to the base implementation. With that out of the way, we can supply our own logging. The base implementation gives us hooks to the following interception points.

  • LogCommand – Writes the SQL statement to the action implementation prior to executing the statement
  • LogResult – Writes when the SQL statement has completed
  • LogParameter – Writes the parameter(s) used in the query
  • Executing/Executed – Called before and after the database request is made
  • NonQueryExecuting/NonQueryExecuted – Called for queries that don’t return results (insert/update/delete and stored procedures without result sets)
  • ReaderExecuting/ReaderExecuted – Called for queries that return tabular results (select)
  • ScalarExecuting/ScalarExecuted – Called for queries that return single-value results (user defined functions)
  • Write – Base write implementation that each of the loggers uses to format the output

As stated before, in our example we want to write a log entry when the command completes, including the calling method name and execution time. To do this we make the following adjustments to the base implementation: 1) suppress logging of the SQL statements in the LogCommand method by making it a no-op that doesn’t delegate to the base implementation, and 2) replace the default result log information with our custom output in the LogResult method. To get the elapsed time, we directly leverage the Stopwatch property of the base class. Here then is our new custom formatter:

Imports System.Data.Entity.Infrastructure.Interception
Imports System.Data.Entity

Public Class CustomDbLogFormatter
    Inherits DatabaseLogFormatter

    Public Sub New(context As DbContext, writeAction As Action(Of String))
        MyBase.New(context, writeAction)
    End Sub

    Public Overrides Sub LogCommand(Of TResult)(command As Common.DbCommand, interceptionContext As DbCommandInterceptionContext(Of TResult))
        'MyBase.LogCommand(Of TResult)(command, interceptionContext)
    End Sub

    Public Overrides Sub LogResult(Of TResult)(command As Common.DbCommand, interceptionContext As DbCommandInterceptionContext(Of TResult))

        Dim context = interceptionContext.DbContexts.OfType(Of NorthwindEfEntities).FirstOrDefault()
        If context IsNot Nothing Then
            Trace.WriteLine(context.CallingMethod + " Completed in " + Stopwatch.ElapsedMilliseconds.ToString)
        End If
   
    End Sub
End Class
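
As an aside, if you do want the SQL text in the log as well, you could replace the no-op LogCommand above with an override that writes just the command text. A minimal sketch:

Public Overrides Sub LogCommand(Of TResult)(command As Common.DbCommand, interceptionContext As DbCommandInterceptionContext(Of TResult))
    ' Write the SQL text instead of suppressing the command output entirely
    Trace.WriteLine(command.CommandText)
End Sub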

One step remains for this new logging implementation to take effect: we need to register the custom logger in our solution. To do this manually in code, we create a new class that derives from DbConfiguration. Then, in the constructor, we call the base class’s SetDatabaseLogFormatter to set the formatter to our new CustomDbLogFormatter. For more information on configuring the logging via your config file rather than in code, see Arthur Vickers’ post today on EF 6.1 turning on SQL logging without recompiling your app.

Public Class LogConfiguration
    Inherits DbConfiguration

    Public Sub New()
        SetDatabaseLogFormatter(Function(context, action) New CustomDbLogFormatter(context, action))
    End Sub
End Class

Naturally, the value that we are outputting is just an example; you’re free to make your logging as complex as you want given these basic building blocks. For more details, make sure to read the EF team’s posts on the logging implementation.

Categories: Entity Framework - VB Dev Center -

Logging ASP.Net WebApi elapsed time for IQueryables

I’ve recently been doing a fair amount of work with exposing queryable structures using ASP.Net Web API. In a nutshell, Web API allows you to expose not only the data from a service request, but also the query structure that can be extended. By default, the Web API scaffolding generates Get(id), Get, Post, Put and Delete methods. Of these, the most interesting, from a query perspective, is the Get method that returns a list of values:

    Public Function GetCustomers() As IEnumerable(Of Customer)
        Dim query = cn.Customers
        Return query
    End Function

By returning the default IEnumerable, the database will return everything that matches the service query definition, and the consuming application isn’t able to customize this view to meet its needs. If you apply a couple of changes, you can enable much greater service flexibility and allow consumers to access your resources the way they want rather than just the way you originally intended. Of course, with great power comes great responsibility. To make our GetCustomers method queryable, we simply change the return type from IEnumerable to IQueryable.

    Public Function GetCustomers() As IQueryable(Of Customer)
        Dim query = cn.Customers
        Return query
    End Function

In addition, we need to enable query support for Web API in the WebApiConfig’s Register method in the App_Start folder:

Public Module WebApiConfig
    Public Sub Register(ByVal config As HttpConfiguration)
        ' Web API configuration and services

        ' Web API routes
        config.MapHttpAttributeRoutes()

        config.Routes.MapHttpRoute(
            name:="DefaultApi",
            routeTemplate:="api/{controller}/{id}",
            defaults:=New With {.id = RouteParameter.Optional}
        )
        config.EnableQuerySupport()
        
    End Sub
End Module

Now, if we run the service, we can not only get the full list of Customers, but also add query options like filter (where), orderby, top, and skip. For example, the following query gets the first two customers from London ordered by the Company Name. For a complete list of the URL query options, refer to the OData Query Specification URI Conventions page.

http://localhost:17413/api/Customer?$filter=City%20eq%20'London'&$orderby=CompanyName&$top=2

Now, how does this fit with the question of logging the requests? If we were to use a traditional logging mechanism, we could just wrap the beginning and end of the service request with a timer and output the difference.

    Public Function GetCustomers() As IQueryable(Of DTO.DtoCustomer)
        Dim sw = New Stopwatch
        sw.Start()

        Dim query = cn.Customers
        sw.Stop()
        System.Diagnostics.Trace.WriteLine("GetCustomers completed in " + sw.ElapsedMilliseconds.ToString)

        Return query
    End Function

This outputs the result: “GetCustomers completed in 1”. Why is the time to execute so short? Remember that with LINQ, the database request is deferred until the results are iterated over (GetEnumerator). In the case of an IQueryable with WebAPI, the query isn’t executed until after our code has returned the base query and WebAPI has had a chance to add on any query parameters that the user supplied. We no longer have access to the query pipeline to hook into, right?
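
As a reminder of the deferred execution at play, here’s a minimal sketch (plain LINQ semantics, not specific to Web API):

Dim query = cn.Customers.Where(Function(c) c.City = "London") ' no database call yet
Dim list = query.ToList() ' the SQL is issued here, when the results are enumerated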

Adding tracing

Not exactly. Mike Wasson pointed out that there is a TraceWriter implementation that you can leverage and extend to hook into the Web API pipeline. For more information, see his article on Tracing in ASP.Net Web API.

In our example, we can output all of the tracing messages as follows. One thing to watch here is that the Kind and Operation are not set until the traceAction is executed; thus you need to execute the action prior to logging the details.

Imports System.Web.Http.Tracing
Imports System.Diagnostics

Public Class ServiceTracer
    Implements ITraceWriter

    Public Sub Trace(request As Net.Http.HttpRequestMessage,
                     category As String,
                     level As System.Web.Http.Tracing.TraceLevel,
                     traceAction As Action(Of TraceRecord)) Implements ITraceWriter.Trace

        Dim rec = New TraceRecord(request, category, level)
        traceAction(rec)
        System.Diagnostics.Trace.WriteLine(String.Format("{0} - {1} {2} - {3}", category, level, rec.Kind, rec.Operation))
    End Sub
End Class

We then need to register this implementation with the WebAPI configuration (again in WebApiConfig):

Public Module WebApiConfig
    Public Sub Register(ByVal config As HttpConfiguration)
        ' Web API configuration and services

        ' Web API routes
        config.MapHttpAttributeRoutes()

        config.Routes.MapHttpRoute(
            name:="DefaultApi",
            routeTemplate:="api/{controller}/{id}",
            defaults:=New With {.id = RouteParameter.Optional}
        )
        config.EnableQuerySupport()
        config.Services.Replace(GetType(ITraceWriter), New ServiceTracer)

    End Sub
End Module

If we run our solution again and access our service, we should see the following in the output window:

System.Web.Http.Request - Info  Begin - 
System.Web.Http.Controllers - Info  Begin - SelectController
System.Web.Http.Controllers - Info  End - SelectController
System.Web.Http.Controllers - Info  Begin - CreateController
System.Web.Http.Controllers - Info  Begin - Create
System.Web.Http.Controllers - Info  End - Create
System.Web.Http.Controllers - Info  End - CreateController
System.Web.Http.Controllers - Info  Begin - ExecuteAsync
System.Web.Http.Action - Info  Begin - SelectAction
System.Web.Http.Action - Info  End - SelectAction
System.Web.Http.ModelBinding - Info  Begin - ExecuteBindingAsync
System.Web.Http.ModelBinding - Info  End - ExecuteBindingAsync
System.Web.Http.Filters - Info  Begin - OnActionExecutingAsync
System.Web.Http.Filters - Info  End - OnActionExecutingAsync
System.Web.Http.Action - Info  Begin - InvokeActionAsync
System.Web.Http.Action - Info  Begin - ExecuteAsync
System.Web.Http.Action - Info  End - ExecuteAsync
System.Net.Http.Formatting - Info  Begin - Negotiate
System.Net.Http.Formatting - Info  Begin - GetPerRequestFormatterInstance
System.Net.Http.Formatting - Info  End - GetPerRequestFormatterInstance
System.Net.Http.Formatting - Info  End - Negotiate
System.Web.Http.Action - Info  End - InvokeActionAsync
System.Web.Http.Filters - Info  Begin - OnActionExecutedAsync
System.Web.Http.Filters - Info  End - OnActionExecutedAsync
System.Web.Http.Controllers - Info  End - ExecuteAsync
System.Web.Http.Request - Info  End - 
System.Net.Http.Formatting - Info  Begin - WriteToStreamAsync
System.Net.Http.Formatting - Info  End - WriteToStreamAsync
System.Web.Http.Controllers - Info  Begin - Dispose
System.Web.Http.Controllers - Info  End - Dispose

Boy, that’s a lot of information, but not quite what we want. If you scan through this list, you can see that WebAPI essentially gives us hooks before and after each operation that it does, including selecting the controller from the request URI, serializing any input/body parameters, executing the service, getting the output formatter, formatting the output, and writing the response. In our case, all we want is to log which method was called and how long it took to execute. Instead of wrapping a stopwatch around the beginning and end of our query definition as we did before, we need to find the time difference between the beginning and end of the service request. Given the trace list above, we can see that we can catch the begin SelectAction operation and the corresponding End Dispose operation and log that.

For the solution, we need to set up a list of TraceRecord objects that we can use to correlate the start and end operations. We’ll use a private items variable to track the start requests. In the WriteTrace method, we check the Kind and Operation to add the record for the Begin of the SelectAction operation. We also check for the End Dispose operation to write the timings. In the end method, we need to find the start item based on the supplied RequestId. (Remember, WebAPI may have multiple service requests executing simultaneously with a single trace writer instance, so you have to take care to match the correct start and end.) The TraceRecord already has a convenient Timestamp property which is set when the traceAction is executed. We simply calculate the difference between the start and end timestamps to output the total execution time. After we output the log record, we remove the starting trace record from the items list to clean up after ourselves. Below is our revised implementation of the trace writer to log the total execution time for the request.

Public Class ServiceTracer
    Implements ITraceWriter

    Dim items As New List(Of TraceRecord)

    Public Sub Trace(request As Net.Http.HttpRequestMessage,
                     category As String,
                     level As System.Web.Http.Tracing.TraceLevel,
                     traceAction As Action(Of TraceRecord)) Implements ITraceWriter.Trace

        Dim rec = New TraceRecord(request, category, level)
        traceAction(rec)
        WriteTrace(rec)
    End Sub

    Private Sub WriteTrace(rec As TraceRecord)
        If rec.Kind = TraceKind.Begin AndAlso rec.Operation = "SelectAction" Then
            items.Add(rec)
        End If
        If rec.Kind = TraceKind.End AndAlso rec.Operation = "Dispose" Then
            WriteServiceEnd(rec)
        End If
    End Sub

    Private Sub WriteServiceEnd(rec As TraceRecord)
        Dim start = items.Find(Function(r) r.RequestId = rec.RequestId)
        If rec.Request IsNot Nothing AndAlso start IsNot Nothing Then
            System.Diagnostics.Trace.WriteLine(String.Format("{0} - {1}: elapsedTime={2}",
                Now, rec.Request.RequestUri.PathAndQuery,
                (rec.Timestamp - start.Timestamp).TotalMilliseconds))
            items.Remove(start)
        End If
    End Sub
End Class

Categories: Entity Framework -

Hierarchical Trees from Flat Tables using LINQ

I’m often tasked with creating a tree representation of a structure from a flat self-referencing table. For example, in the EF extensions to Northwind, they extended the Employee table so that it has a self-referencing “ReportsTo” column. As you can see from the data below, Andrew Fuller does not report to any other employees, but Nancy, Janet, Margaret, Steven, and Laura all report to Andrew (because their ReportsTo value is the same as Andrew’s EmployeeID). Likewise Michael, Robert, and Anne all report to Steven.

[Figure: Employee rows showing each employee's EmployeeID and ReportsTo values]

In order to generate a tree representation of these records, we could start with the root records that don’t report to anyone else and then lazy load each level of their children. Unfortunately, for large graphs the number of database hits will grow exponentially as we add tree levels. Typically with parent-child relationships, we could eager load the children using DataLoadOptions with LINQ to SQL or .Include with Entity Framework. However, with self-referencing entities this isn’t allowed. With LINQ to SQL, you will get an InvalidOperationException: “Cycles not allowed in LoadOptions LoadWith type graph.”
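
As a quick illustration, here’s a minimal LINQ to SQL sketch that triggers the exception (assuming a generated context named NorthwindDataContext whose Employee entity exposes its child collection as Employees):

Dim opts = New System.Data.Linq.DataLoadOptions()
' Eager loading a self-referencing child collection is rejected:
' InvalidOperationException "Cycles not allowed in LoadOptions LoadWith type graph"
opts.LoadWith(Of Employee)(Function(e) e.Employees)

Dim context = New NorthwindDataContext()
context.LoadOptions = opts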

So, how do we load the tree in one pass and build the object graphs? It’s really not that hard once you realize how reference types (classes) work in .Net. Let’s start by creating a holder for each employee and their associated direct reports:

Public Class HierarchicalEmployee
    Public Sub New(emp As Employee)
           Model = emp
    End Sub
    Public Property Model As Employee
    Public Property DirectReports As New List(Of HierarchicalEmployee)
End Class

Now that we have this type, we can fill it using a simple LINQ request. In order to optimize the next step, we’ll push the values into an in-memory Dictionary indexed by the EmployeeID:

Dim allEmployees = Employees.
    Select(Function(emp) New HierarchicalEmployee(emp)).
    ToDictionary(Function(emp) emp.Model.EmployeeID)

Next, we iterate over the full list. For records that have a ReportsTo value, we’ll add their object pointer to their parent’s DirectReports list:

For Each emp In allEmployees.Values
  If emp.Model.ReportsTo.HasValue Then
    allEmployees(emp.Model.ReportsTo.Value).DirectReports.Add(emp)
  End If
Next

Notice that here we take advantage of the Dictionary’s hash lookup rather than having to iterate over the list each time we want to find a parent record. Finally, instead of returning the full list, we only return the employees that don’t report to anyone else (where ReportsTo is null):

Dim rootEmployees = allEmployees.Values.
    Where(function(emp) Not emp.Model.ReportsTo.HasValue())

If we want to test this out in LinqPad, just use the Dump method on the resulting rootEmployees. As a result, you’ll see the following in the results pane. Notice that Andrew is the only root object; he has 5 direct reports, and one of his reports has 3 reports of their own. You could just as easily bind this to a treeview control or output it using your favorite UI tooling.

[Figure: LinqPad results pane showing Andrew as the sole root with nested DirectReports]

The nice thing about this solution is that if we check the generated SQL, we will see just a single SQL request to generate the entire graph. As a summary, here’s the complete code from the LinqPad sample:

Sub Main
    Dim allEmployees = Employees.
        Select(Function(emp) New HierarchicalEmployee(emp)).
        ToDictionary(Function(emp) emp.Model.EmployeeID)

    For Each emp In allEmployees.Values
        If emp.Model.ReportsTo.HasValue Then
            allEmployees(emp.Model.ReportsTo.Value).DirectReports.Add(emp)
        End If
    Next
    
    Dim rootEmployees = allEmployees.Values.
        Where(function(emp) Not emp.Model.ReportsTo.HasValue())
        
    
    rootEmployees.Dump
End Sub

Public Class HierarchicalEmployee
    Public Sub New(emp As Employee)
           Model = emp
    End Sub
    Public Property Model As Employee
    Public Property DirectReports As New List(Of HierarchicalEmployee)
End Class
Categories: VB Dev Center - LINQ - Entity Framework -

Aggregate clause issues

I was reviewing a Stack Exchange message regarding the Aggregate clause in VB where they found that the query was issuing multiple requests to the database and occasionally returning an entire database table to memory and using LINQ to Objects over the result. I also found that Frans Bouma blogged about this back in 2008. Consider the following LINQ query over Northwind:

Dim query = Aggregate o In Orders
            Into Sum(o.Freight),
                 Average(o.Freight),
                 Max(o.Freight)

This produces the following T-SQL statements in EF. Notice here that the Sum and Avg are performed on the server, but the Max pulls the entire table into memory and performs the Max on the client. It would seem that this is an issue in the expression tree parser.

SELECT
[GroupBy1].[A1] AS [C1]
FROM ( SELECT
   SUM([Extent1].[Freight]) AS [A1]
   FROM [dbo].[Orders] AS [Extent1]
)  AS [GroupBy1]

GO

SELECT
[GroupBy1].[A1] AS [C1]
FROM ( SELECT
   AVG([Extent1].[Freight]) AS [A1]
   FROM [dbo].[Orders] AS [Extent1]
)  AS [GroupBy1]

GO

SELECT
[Extent1].[OrderID] AS [OrderID],
[Extent1].[CustomerID] AS [CustomerID],
[Extent1].[EmployeeID] AS [EmployeeID],
[Extent1].[OrderDate] AS [OrderDate],
[Extent1].[RequiredDate] AS [RequiredDate],
[Extent1].[ShippedDate] AS [ShippedDate],
[Extent1].[ShipVia] AS [ShipVia],
[Extent1].[Freight] AS [Freight],
[Extent1].[ShipName] AS [ShipName],
[Extent1].[ShipAddress] AS [ShipAddress],
[Extent1].[ShipCity] AS [ShipCity],
[Extent1].[ShipRegion] AS [ShipRegion],
[Extent1].[ShipPostalCode] AS [ShipPostalCode],
[Extent1].[ShipCountry] AS [ShipCountry]
FROM [dbo].[Orders] AS [Extent1]

For comparison, here’s the queries issued from LINQ to SQL:

SELECT SUM([t0].[Freight]) AS [value]
FROM [Orders] AS [t0]
GO

SELECT AVG([t0].[Freight]) AS [value]
FROM [Orders] AS [t0]
GO

SELECT [t0].[OrderID], [t0].[CustomerID], [t0].[EmployeeID], [t0].[OrderDate], [t0].[RequiredDate], [t0].[ShippedDate], [t0].[ShipVia], [t0].[Freight], [t0].[ShipName], [t0].[ShipAddress], [t0].[ShipCity], [t0].[ShipRegion], [t0].[ShipPostalCode], [t0].[ShipCountry]
FROM [Orders] AS [t0]

Interestingly, if you use From with Group By instead of Aggregate, the expression tree parsers seem to handle this better. For example, the original query can be re-written as follows:

Dim query = From o In Orders
            Group By key = 0
            Into Sum(o.Freight),
                 Average(o.Freight),
                 Max(o.Freight)

This produces the following SQL (using LINQ to SQL):

SELECT SUM([t1].[Freight]) AS [Sum], AVG([t1].[Freight]) AS [Average], MAX([t1].[Freight]) AS [Max]
FROM (
    SELECT @p0 AS [value], [t0].[Freight]
    FROM [Orders] AS [t0]
    ) AS [t1]
GROUP BY [t1].[value]

For the time being at least, I have to agree with Frans that it is best to avoid using the Aggregate keyword in VB when accessing a database. I’ll update this if I hear any updates that are on the horizon.

Categories: VB Dev Center - LINQ - Entity Framework -

LINQ to Database Performance hints

Although I’ve been writing primarily about RX here for a while, I still dig into some good old LINQ to SQL / EF quite a bit. Along the way, I’m frequently finding more tidbits to throw in the toolbox and thought I’d share some observations from a recent performance tuning project. (I’ll avoid disclosing the client and will change the database model to Northwind for this discussion protect the guilty.)

In this project, I was brought in at the end, as things started going south, to try to get it across the finish line. As with many projects, they waited until development was complete before they started to consider the performance impact of their code. This violates one of the principal tenets of performance testing:

RULE 1: Test early and often

Performance for larger scale apps needs to be considered from the beginning of the project and measured. If you don’t have an ongoing way of measuring the performance as the application progresses through the development lifecycle, you will find it difficult to pinpoint specific performance bottlenecks over time.

As an example, several years ago I met with core members of the Microsoft CLR team. They shared the fact that they compare performance on each night’s build, and if they find more than a couple of microseconds’ deviation, the teams are required to identify the source of the performance degradation. When working with systems that have millions of lines of code, if they waited a week it may become hard or impossible to identify the multiple places where performance has dropped by 1/100th of a second. Over time these performance penalties build up.

I don’t mean to indicate that your Mom & Pop web site needs to live up to this stringent of a testing standard. Also, I’m not insinuating that you should prematurely over optimize your code, but try to make intelligent decisions and have ways of determining when your decisions negatively impact your performance up front.

Why do I bring this up in the context of LINQ to DB (SQL/EF)? If you’re not testing and monitoring your app, you may find that a nice abstraction layer you added just killed your performance. In the case of the client’s application, they had nicely abstracted away the ability to load various database structures into a generic repository. They had separate methods for GetCustomerById, GetOrdersForCustomerId, GetOrderDetailsForOrderId, etc. They also had helper methods for validation, including ValidateIsCustomerNameUnique. The downside is that when reading through the code, it wasn’t easy to notice the hidden database calls sprinkled throughout. This brings up rule number 2 for performance testing with databases:

RULE 2: Profile your application

Profiling your application for external requests, including services, is essential to make sure the requests you are making are not causing a performance bottleneck. I highly recommend using some sort of profiling tool, particularly for people who are new to LINQ. LINQ makes it easy to make silly mistakes like issuing n+1 requests to a database when navigating lazy-loaded parent-child relationships.

There are plenty of options for profiling applications. I identified some of them back in my LINQ Tools post. Some of these require adding code to your assemblies, while others simply attach to running processes.

If you have a license for SQL Server, you probably have access to SQL Profiler. To use this tool, create a new profile pointing at your database and run your application. Once you have executed the code you want to profile, you can view all of the requests made against that database. Pay particular attention to cases where the same SQL is issued multiple times in succession. The downside here is that you will need to manually trace through your code base to find where each request is made in order to fix the issues.

Another alternative, if you have the Ultimate SKU of Visual Studio, is to use the IntelliTrace feature added in Visual Studio 2010. Using IntelliTrace, you can not only identify which SQL requests were issued to the database, you can also select each line in the results and navigate to the line of code that caused the request to be issued. Let’s consider this fairly innocuous code snippet to output the order date and product names associated with each order:

   1:  Using model As New NwindEntities
   2:      For Each o In model.Orders.Take(3)
   3:          Console.WriteLine(o.OrderDate)
   4:          For Each od In o.Order_Details
   5:              Console.WriteLine(od.Product.ProductName)
   6:          Next
   7:      Next
   8:  End Using

Running this and putting a breakpoint on line 8, we can see the following IntelliTrace output:

[Figure: IntelliTrace output showing a separate ADO.NET query for each order's details and products]

From here, we can see that as we iterate over each order, we are making a separate request to the database for each associated order detail record to get the related product object. Clicking on any of the lines of the output will take you directly to the line of code from which the request was made. Unfortunately, you won’t be able to view the parameters for each request using IntelliTrace as you could with SQL Profiler, but that’s less important from a performance tuning perspective than knowing that you have excessive requests to the database. If you find that the IntelliTrace output becomes cluttered with non-database related trace items, it often helps to filter the results to only ADO.Net events:

[Figure: IntelliTrace filter limited to ADO.NET events]

Fortunately, fixing the code above only requires changing line 2 in the code above to eagerly load the child records:

For Each o In model.Orders.Include("Order_Details").Include("Order_Details.Product").Take(3)

We may want to further optimize this request by not hydrating the entire objects, instead fetching just the columns that we want:

Using model As New NwindEntities
    Dim query = From o In model.Orders
                Select o.OrderDate,
                    ProductNames = From od In o.Order_Details
                                   Select od.Product.ProductName

    For Each o In query.Take(3)
        Console.WriteLine(o.OrderDate)
        For Each p In o.ProductNames
           Console.WriteLine(p)
        Next
    Next
End Using

If you don’t have access to SQL Profiler or Intellitrace, consider one of the other relatively inexpensive profiling tools out there like the MVC MiniProfiler, ORM ProfilerHuagati’s LINQ to SQL Profiler, EF Prof, or at a bare minimum checking the generated code for your queries using LinqPad.

With your application now profiled, hopefully you won’t run into issues such as those found in the following code (which I did see in the customer’s app, again changing the model to protect the guilty):

Debug.WriteLine(Customers.FirstOrDefault().CompanyName)
Debug.WriteLine(Customers.FirstOrDefault().ContactName)
Debug.WriteLine(Customers.FirstOrDefault().ContactTitle)
Debug.WriteLine(Customers.FirstOrDefault().Phone)
Debug.WriteLine(Customers.FirstOrDefault().Fax)
Dim CustomerOrders = Customers.FirstOrDefault().Orders
For Each order in CustomerOrders
    ' Load the order into a POCO
Next

Quick quiz, boys and girls: without using a profiler, how many requests are we making to our database here? Remember that not only will you get a database hit for each call to FirstOrDefault, which executes immediately rather than deferring, but you’ll also get a call on GetEnumerator (called internally in the For Each iteration). Thus the count is 7 for the above code, right? Actually, it’s worse, because hidden behind the Debug.WriteLine is a trace writer which also writes the value to the database’s log table. As a result, we actually have 12 database requests (7 reads and 5 writes) instead of the single query that we should have used. In this case we’re breaking rule #3:

RULE 3: Don’t fetch needlessly

In the above code, we can simply fetch our target customer once and automatically include their associated Orders as follows:

Dim firstCustomer = Customers.Include("Orders").FirstOrDefault()
Debug.WriteLine(firstCustomer.CompanyName)
Debug.WriteLine(firstCustomer.ContactName)
Debug.WriteLine(firstCustomer.ContactTitle)
Debug.WriteLine(firstCustomer.Phone)
Debug.WriteLine(firstCustomer.Fax)
Dim CustomerOrders = firstCustomer.Orders
For Each order in CustomerOrders
    ' Load the order into a POCO
Next

In this case, we fetch the firstCustomer once and reuse it rather than calling FirstOrDefault repeatedly. We also use the .Include option to eagerly fetch the child records. Here, I’m saddened that the original developer didn’t use a bit of brain power to eliminate those extra database hits FOR LOGGING PURPOSES, but I assume it was because they weren’t aware of when their database was being hit. Of course, this brings us back to rule 2: profiling.

Simply removing excessive database hits will almost always improve your code’s performance. In one case, I had a request that was taking 10 minutes. After removing the excessive hits, the time came down to 10 seconds, which was a definite improvement. However, 10 seconds still does not make the application web scale, as the database’s CPU is pegged for that entire time. Sometimes it is actually best to break the process up a bit to improve performance.

RULE 4: Break up complex queries

To give you an idea of the kind of query we’re talking about here, consider the following case where we are fetching the employees in the USA, including their regions and sales information.

Dim empOrders = 
    From e In Employees
    Where e.Country = "USA"
    Select 
        e.FirstName,
        e.LastName,
        Regions = From t In e.Territories
                  Select t.Region.RegionDescription
                  Distinct,
        TopThreeSales = From o In e.Orders
                        From od In o.OrderDetails
                        Select
                            od.Product.ProductName,
                            TotalSale = od.Quantity * od.UnitPrice
                        Order By TotalSale Descending

While this code will compile and run, the performance will start to suffer as larger volumes of data are added to the database. The reason is that both Territories and Orders are child collections of Employees. As a result, SQL will return the Cartesian product between the two sets. In other words, if a single employee is associated with 5 territories and has sold 10 products, the total number of rows returned is 50. The OR/M is then responsible for splitting those results back out into the correct groupings. If you multiply this across 10,000 employees, you will find that a massive amount of excess data is returned.

In this case, it may be helpful to split your sub-queries up into separate database requests. You could do something like the following, using the Contains clause to pass the IDs of the parent records into the subsequent queries.

Dim empOnly = 
    (From e In Employees 
    Where e.Country = "USA" 
    Select 
        e.EmployeeID, 
        e.FirstName, 
        e.LastName). 
    ToList()

Dim EmpIds = empOnly.Select(Function(e) e.EmployeeID)

Dim TopThreeSales = 
    From o In Orders
    Where EmpIds.Contains(o.EmployeeID)
    From od In o.OrderDetails
    Select 
        o.EmployeeID,
        od.Product.ProductName,
        TotalSale = od.Quantity * od.UnitPrice
    Order By TotalSale Descending

However, Contains has a couple of hidden surprises that limit our use here. First, if you use Contains, you cannot make the query into a compiled query, because the number of parameters varies at run time. Second, if you have more than 2100 items in your EmpIds collection, you will run into a hard limit in SQL Server, which only allows up to 2100 parameters. In that event, it is better to re-apply the original filter in each of the subsequent queries. In the end, we can join the separate result sets back together using LINQ to Objects:

Dim empOnly = 
    (From e In Employees
    Where e.Country = "USA"
    Select 
        e.EmployeeID,
        e.FirstName,
        e.LastName).
    ToList()

Dim Regions = 
    (From e In Employees 
    Where e.Country = "USA"
    From t In e.Territories
    Select 
        e.EmployeeID,
        t.Region.RegionDescription
    Distinct).
    ToList()

Dim TopThreeSales = 
    (From e In Employees 
    Where e.Country = "USA"
    From o In e.Orders
    From od In o.OrderDetails
    Select 
        o.EmployeeID,
        od.Product.ProductName,
        TotalSale = od.Quantity * od.UnitPrice
    Order By TotalSale Descending).
    ToList()
                
Dim combined = 
    From e In empOnly
    Select 
        e.FirstName,
        e.LastName,
        EmpRegions = Regions.Where(Function(reg) e.EmployeeID = reg.EmployeeID),
        Sales =  TopThreeSales.Where(Function(sale) e.EmployeeID = sale.EmployeeID)
    

In the above referenced project, I was able to take some queries that, as a single LINQ statement, took 10 seconds to run on the database down to sub-second requests. Your mileage may vary using this technique. If at all unsure, refer back to Rule 2: Profile.

RULE 5: Use ORM by default but Stored Proc where necessary

At times you will find that the generated query is too complex and the database has issues trying to parse the generated SQL. The complexity of your query, particularly the number of joins and the depth of the object graph/inheritance model you are traversing, can cause issues. In these cases, I have no objection to using stored procedures in order to wrangle the otherwise unruly queries. LINQ to DB is great for 70-80% of CRUD operations, but there are times, particularly in reporting, when you need something else. Thankfully, LINQ to SQL and EF both support consuming stored procedures when the need arises, without the tedious and potentially error-prone custom ADO code you would otherwise write yourself.
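
As a sketch of how simple that consumption can be with LINQ to SQL (the stored procedure name and result class here are hypothetical), ExecuteQuery materializes the results onto a class without any hand-written ADO code:

' Hypothetical result shape for a reporting stored procedure
Public Class CustomerSalesSummary
    Public Property CustomerID As String
    Public Property TotalSales As Decimal
End Class

' context is a LINQ to SQL DataContext; the results map onto the class by column name
Dim summaries = context.ExecuteQuery(Of CustomerSalesSummary)(
    "EXEC dbo.GetCustomerSalesSummaries @Country = {0}", "USA").ToList()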

In addition, LINQ is not a replacement for ETL operations. In one case, saving a single record with 2000 children caused the database to churn for 10 minutes due to 2001 individual hits to the database during the update process (from a single SaveChanges call). We rewrote that operation using BulkCopy and brought it down from 10 minutes to 1 second.
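
For reference, here’s a minimal SqlBulkCopy sketch (the connection string and table name are placeholders); the child rows stream to the server in a single bulk operation instead of thousands of individual INSERTs:

Imports System.Data.SqlClient

' childRows is a DataTable whose columns match the destination table
Using bulk As New SqlBulkCopy("<connection string>")
    bulk.DestinationTableName = "dbo.OrderDetails"
    bulk.WriteToServer(childRows)
End Using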

RULE 6: Use Appropriate Sample Data

The last item for today is to make sure that, when developing, you have quantities of sample data representative of what you will have after a couple of years of production data enters your system. We found a number of cases where complex queries like the one I described in rule 4 would perform fine in development, where we had thousands of rows of data, but when we ran them against the production data, which had millions of rows, the performance died. The complex joins which worked fine against the smaller data sets no longer worked against the bigger ones. If we had had a good approximation of production data volumes in development, we would have been able to diagnose and fix this issue before shipping the version to production.

That’s just a couple of my rules of thumb which have helped me diagnose and fix performance issues with LINQ to SQL/EF. If you have any tricks or techniques to add, please let me know what you Thinq.

Categories: LINQ - VB Dev Center - Entity Framework -

LINQ to Entity Visualizer

When demonstrating the LINQ tools, I typically start out by showing the LINQ to SQL visualizer that’s available with the C# Samples. Today I saw that Raja Venkatesh has released a visualizer for ObjectQuery<T> (aka LINQ to Entities). As with the other visualizers, you enable this by simply saving the AnySourceEntityQueryVisualizer.dll to your Visualizers directory. (Note: the download page says to copy it to C:\Program Files (x86)\Microsoft Visual Studio 9.0\Common7\Packages\Debugger\Visualizers. However, Windows 7 blocks saving files there by default. Alternatively, you can copy it to My Documents\Visual Studio 2008\Visualizers.)

Once you install it, put a breakpoint after your query is declared. If you hover over the query variable, you should see the magnifying glass indicating the debugger visualizer:

[Figure: debugger DataTip showing the visualizer magnifying glass over the query variable]

Clicking on the magnifying glass brings you to a screen that shows the Expression, projected SQL, and Connection properties. If you click “Execute”, you will see the server log and a grid with the results.

[Figure: the visualizer window showing the Expression, SQL, and Connection details with the Execute results grid]

If you want to try it out, feel free to download it from the Visual Studio Gallery page for the LINQ to Entity Visualizer. I’ll be adding a linq to this on my LINQ Tool list as well.

Categories: Entity Framework - VS 2008 - Visual Studio -

LINQ to SQL enhancements for 2010

One question that I'm asked repeatedly is, "Is LINQ dead?" The quick answer is, NO. The more appropriate question is "Is LINQ to SQL dead?" That one is a bit trickier, particularly after some of the blog posts from the data programmability team regarding their emphasis moving forward for LINQ to SQL and the Entity Framework.

My take on it is that LINQ to SQL is in a similar situation to Winforms. Both are still supported and have teams dedicated to them, but don’t expect much in terms of new features; rather, the focus is on bug fixes. The main development focus for new features is placed on the Entity Framework and, similarly, WPF. For those who love taking SAT tests, you can think of it as:

LINQ to SQL : Entity Framework :: Winforms : WPF

As proof of the point, Damien Guard, one of the people on the LINQ to SQL team, just posted a list of 40+ changes that are coming to LINQ to SQL in .Net 4.0. Looking through the list, most of the items are bug fixes rather than feature enhancements.

Of the list of items, the biggest change I see is specifying the text parameter lengths. Adding this eliminates some of the DBAs’ performance concerns in terms of query plan reuse. This is one area that DBAs have focused on as a performance issue, and it was an unfortunate oversight in the initial release.

There are a number of larger features that people continue to ask for that are not being included:

  • Support for more complex table to object relationships. This is really the point of the EDM, so I don't see this ever making it into LINQ to SQL.
  • Ability to update a model from database schema changes. Again, the EDMX designer supports this, so I wouldn't hold your breath.
  • No support for SQL 2008's new data types, including the spatial types, HierarchyId, and Filestream. If any of these features were to be added, I would expect this to be the one included at some point.

Damien’s list should be proof that LINQ to SQL is not dead, but it isn’t going to receive the significant enhancements that you may want. Check out the 2010 beta, see what changes are there for yourself, and give feedback to the teams before it’s too late.

Categories: Entity Framework - LINQ - Visual Studio -

ADO.NET Entity Framework Documentation Samples in VB

Last week, I announced that my translations of the Entity Framework samples were available in VB. Today the ADO.Net team announced that the next set has been posted. These are part of the ADO.Net Entity Framework Documentation Samples: the projects used in the EF quick start and walkthroughs that come with the .Net documentation. They are a set of mini applications demonstrating the use of EF within the context of an application.

The Course Manager sample was previously translated, but the HR Skills, AdventureWorks data binding, and research and collaboration tool samples were just updated today. Unlike the other samples, these don’t have separate downloads for each language, but rather include both C# and VB versions in each download. Here’s the description of each of these projects as taken from the MSDN site:

  • CourseManager.zip
    The CourseManager Windows forms application created by completing the Entity Framework quickstart.
  • HRSkillsCombined.zip
    This is a Visual Studio 2008 solution that contains both a Windows forms project and an ASP.NET project. Both samples demonstrate data binding to entity objects. The ASP.NET sample uses the EntityDataSource control for data binding.
  • AdWksSalesWinDataBind.zip
    The AdventureWorks Data Binding sample demonstrates data binding that uses the Entity Framework. This application displays and modifies SalesOrderDetail entities associated with SalesOrderHeader entities.
  • ResearchCollaborationAssistant.zip
    The Annotation and Research Collaboration Tool aids research and collaboration by creating reference annotations and contact entities that can be searched for both relevant Web pages and people associated with topics or search texts.

I hope you find these samples helpful. I’m not sure that I would recommend the manual databinding that the team used when creating these samples, as there are quite a few cases where they could have relied on native databinding rather than manually adding items to text boxes. These translations were intentionally fairly literal.

If you just want to learn the query syntax and see the capabilities of EF, the Entity Framework Query Samples are a better source of information.

Thanks to both the VB Team and Data Team for recognizing the need for these samples in VB.

Categories: Entity Framework - VB Dev Center - VB - ADO.Net Data Services -

Entity Framework Samples in Visual Basic

For those Visual Basic users out there that have struggled with the fact that the samples were only available in C#, you can now rejoice. A number of projects have now been translated into Visual Basic for your learning pleasure. You can find these samples on the MSDN Code Gallery’s Entity Framework page. At this point the following projects have been translated.

  • Entity Framework Query Samples Compatible with .NET Framework 3.5 SP1 and Visual Studio 2008 SP1 (Visual Basic and C# versions available)
    The Entity Framework Query Samples is a small Windows Forms program that contains several basic Entity SQL and LINQ to Entities queries against the NorthwindEF Entity Data Model (based on a modified version of Northwind). Its goal is to help you learn the features of the two query languages supported by EF and visualize what the results and the translated store query look like.
  • Entity Framework Lazy Loading Compatible with .NET Framework 3.5 SP1 and Visual Studio 2008 SP1 (Visual Basic and C# versions available)
    This sample shows how to use code generation to add support for transparent lazy loading to Entity Framework. It includes code generator (EFLazyClassGen), supporting library (Microsoft.Data.EFLazyLoading) and sample test applications.
  • Persistence Ignorance (POCO) Adapter for Entity Framework V1 Compatible with .NET Framework 3.5 SP1 and Visual Studio 2008 SP1 (Visual Basic and C# versions available)
    EF POCO Adapter enables Plain Old CLR Objects (POCOs) to be tracked using the released version of Entity Framework V1 via automatically generated adapter objects. It consists of a code generator, a supporting library, and a test suite with examples.

There are several more on the way. I’ll try to update this post when they become available.

5/22: Update: Two more samples went online today:

  • EF Extensions Compatible with .NET Framework 3.5 SP1 and Visual Studio 2008 SP1 (Visual Basic and C# versions available)
    The ADO.NET Entity Framework Extensions library includes utilities that make querying stored procedures, creating typed results from DB data readers and state tracking external data much easier in the Entity Framework. A sample application demonstrates several patterns using these utilities, including stored procedures with multiple result sets, materialization of CLR types, and registering entities in the Entity Framework state manager.
  • ADO.NET Data Services IUpdateable implementation for Linq to Sql
    Sample implementation of ADO.NET Data Services IUpdateable interface for Linq to Sql Data Sources.
Categories: Entity Framework - VB - VB Dev Center - LINQ - ADO.Net Data Services -

Julie Lerman's Programming the Entity Framework

I had the pleasure of reviewing an early release of Julie Lerman's Programming the Entity Framework. Julie just reported that it has been sent to print. Having published LINQ in Action last year, I know what a relief it can be to get a project of this size out the door.

It is a good read, full of information that can only be gained by using the framework in real-life applications. She's done a good job not only of presenting the framework, but also of giving recommendations on how to use it in a variety of application environments. Special attention is given to the trials and tribulations of building n-tier applications with the Entity Framework.

If you are interested in taking your LINQ knowledge and moving up to the Entity Framework, I recommend checking it out. Head on over to her blog to find out how to pre-order your copy.

Categories: Entity Framework -

Changing the Namespace on generated Entity Framework classes

As I was preparing a presentation recently, I started hitting my head against a brick wall when trying to change the namespace on the entities generated by the Entity Framework. Having spent so much time with LINQ to SQL, I was anticipating that the behavior would be similar enough to make this easy. Unfortunately, I was mistaken.

First, what I tried to do wrong. With LINQ to SQL, if you click the unused design surface, you can set the namespace for the Context separately from the Entities through the property window of the dbml file.

The Entity designer window for the edmx file also has a namespace option in the property window. In this window you can set the namespace for the ConceptualEntityModel’s Schema. Notice from the image here that the help indicates this should be “The namespace for the Entity Data Model.” Unfortunately, this does not mean it will be the namespace used when the code generation tool generates the actual VB or C# classes that represent the conceptual model.

After jumping back and forth between the XML, the classes, and the Class View, and trying to build and rebuild the project, I got very frustrated because the namespace that I specified appeared in the EDMX file, but the generated classes were not appearing with the correct namespace. I finally gave up and sought out a bit of help. Shawn Wildermuth came to the rescue and pointed me to the property window of the NWindModel.edmx file itself rather than the property window of an area in the design surface.

On the property window for the .edmx file we need to set the value for the Custom Tool Namespace. Once that is done, our classes will be generated in the namespace that we expected.

Looking a bit deeper at this provides some interesting information. The LINQ to SQL dbml file saves the EntityNamespace and ContextNamespace in the Database element:

<Database Name="Northwind" EntityNamespace="Linq" ContextNamespace="Linq" Class="NWind2DataContext" xmlns="http://schemas.microsoft.com/linqtosql/dbml/2007">

In contrast, the namespace that we set on the custom tool namespace is not included within the EDMX at all. Instead, it is contained in the actual project file (vbproj or csproj):

<ItemGroup>
   <EntityDeploy Include="NWindModel.edmx">
      <Generator>EntityModelCodeGenerator</Generator>
      <LastGenOutput>NWindModel.Designer.cs</LastGenOutput>
      <CustomToolNamespace>NWindBo.NWindModel</CustomToolNamespace>
   </EntityDeploy>
</ItemGroup>

Categories: Entity Framework - LINQ -

Win7 and the LINQ to SQL and LINQ to Entity designers

I've been playing with the Windows 7 Beta 1 since it came out. So far, I've been really impressed with what I'm seeing. I've installed quite a bit and have put a full development environment on it, including Visual Studio 2008 and SQL Server 2008.

So far, the only real issue I've seen is a small bug in the LINQ to SQL and LINQ to Entity Framework design surfaces. In these, if you move the mouse over one of the entities on the designer, the entity may disappear.

There's a quick work-around for this issue: simply move the vertical scroll bar down. Once you do that, the entities will remain visible on the design surface. Rest assured, the development team is aware of this glitch and I'm sure there will be a fix for it by the time the bits ship.

Categories: Entity Framework - LINQ - VS 2008 -

ADO.Next and the Entity Framework

Last month, a document describing ADO.Next was published and quickly pulled. A new version was just posted to http://msdn.microsoft.com/data/. I read the original document but neglected to save a copy before it was pulled. Skimming over the new ADO.Next document and the ADO Entity Framework documents that were posted on 6/15, I see that there appear to have been a number of significant changes made in the last month. I hope to read through them more thoroughly in the near future and comment on them at that point. At least this time, I remembered to save and print the documents.

From the looks of things, the new documents look much more promising and allude to some new tool sets to make mapping relational data structures to object hierarchies much easier. One tool appears to be similar to the BizTalk mapper (including the functoids), creating XML-based mappings between the tables and objects. They also appear to be extending the DLINQ designer to allow for virtual views, or objects which are based on multiple data structures. Both of these are good things in my opinion and may help us move more easily away from the table-centric mindsets typical of people who see the simple demos and think that they are indicative of the kinds of applications we should build in real life.

Categories: Entity Framework -