IDbConnection vs SqlConnection

When I write an application, I code against the System.Data interfaces (IDbConnection, IDbCommand, IDataReader, IDbDataParameter, etc.) to reduce vendor dependencies. Unless it is a simple test application, this just seems like the ethical thing to do when you consult.

However, nearly all the code I see uses the System.Data.SqlClient classes or other provider-specific classes. In magazines and books it is easy to chalk this up to Microsoft and its marketing push to program only against SQL Server, but the same is true of almost all the .NET code I see in the wild: it uses the concrete SQL Server classes.

I understand that the provider-specific classes offer more functionality. For example, adding a parameter to a SqlCommand is a one-liner, whereas adding one through IDbCommand is an annoying 4+ lines of code. Then again, writing a small helper for these rough edges is quite simple.
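For illustration, here is roughly what such a helper might look like. This is only a sketch; the AddParameter extension method is a name I am making up here, not part of System.Data:

    using System;
    using System.Data;

    static class DbCommandExtensions
    {
        // Hypothetical helper: wraps the verbose IDbDataParameter dance in one call.
        public static IDbDataParameter AddParameter(
            this IDbCommand command, string name, object value)
        {
            IDbDataParameter parameter = command.CreateParameter();
            parameter.ParameterName = name;
            parameter.Value = value ?? DBNull.Value;
            command.Parameters.Add(parameter);
            return parameter;
        }
    }

    // Provider-specific, one line:   sqlCommand.Parameters.AddWithValue("@id", 42);
    // Interface-only with the helper: command.AddParameter("@id", 42);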

I also wondered whether programming against the interfaces when SQL Server is the only current target is over-engineering, since the flexibility is not needed right now. I do not think it is, though, because the cost of coding against the interfaces is so low, while the reduced vendor dependence is such a large benefit.

Do you use the provider-specific data classes or the interfaces?

EDIT: To summarize some of the answers below, and my thoughts on reading them.

Possible problems with using the interfaces for vendor neutrality:

  • Provider-specific keywords baked into your SELECT statements (all my inserts, updates, and deletes are in stored procedures, so this is not a problem for me)
  • Binding directly against the database is likely to raise issues.
  • Even if your connection instantiation is centralized, a provider-specific class still has to be instantiated somewhere (see the factory sketch after these lists).

Reasons in favor of using the interfaces:

  • In my experience, the ability to switch to another vendor (even if never exercised) has always been appreciated by the client.
  • Use the interfaces in reusable libraries.
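As a sketch of what a centralized, vendor-neutral instantiation point can look like: the only place that names a provider is a configuration entry, and DbProviderFactories does the rest. The ConnectionFactory class and the "Default" connection-string key are illustrative names, not a standard API:

    using System.Configuration;
    using System.Data;
    using System.Data.Common;

    static class ConnectionFactory
    {
        // The single spot in the code base that knows which provider is in use.
        public static IDbConnection Create()
        {
            // Assumes a <connectionStrings> entry named "Default" with, e.g.,
            // providerName="System.Data.SqlClient".
            ConnectionStringSettings settings =
                ConfigurationManager.ConnectionStrings["Default"];

            DbProviderFactory factory =
                DbProviderFactories.GetFactory(settings.ProviderName);

            DbConnection connection = factory.CreateConnection();
            connection.ConnectionString = settings.ConnectionString;
            return connection;
        }
    }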
+6
design-decisions
6 answers

There are differences in the SQL you feed to those classes depending on which database engine you are talking to, so even if you manage to write all of your code against the interfaces, you still end up maintaining several sets of SQL.

Or you can do what we did: write your own layer that handles the syntax differences we actually use. It does not cover every bit of syntax the various engines provide, but enough that we can maintain one set of SQL that gets rewritten correctly before execution.

Basically, we created a function syntax where we prefix function names with SQL::, which lets our code identify the special functions that need rewriting. They are then parsed and rewritten appropriately, even reordering arguments when necessary.

This handles small things, such as the name of the function that returns the server's current date and time, as well as bigger things, such as how to select only the first N rows of a query.

In addition, parameters for SQL Server are written as @name, whereas OleDb uses positional placeholders (you just put a ? where you want the parameter), and our code handles those differences too.
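The answer does not show its actual implementation, so the following is only a toy sketch of the rewriting idea under my own assumptions: SQL::NOW() and SQL::TOP(n) markers, a regex-based rewrite, and two target dialects:

    using System.Text.RegularExpressions;

    enum SqlDialect { SqlServer, MySql }

    static class SqlRewriter
    {
        // Toy rewrite: translate SQL::NOW() and SQL::TOP(n) markers into the
        // target engine's syntax before the statement is executed.
        public static string Rewrite(string sql, SqlDialect dialect)
        {
            switch (dialect)
            {
                case SqlDialect.SqlServer:
                    sql = sql.Replace("SQL::NOW()", "GETDATE()");
                    sql = Regex.Replace(sql, @"SQL::TOP\((\d+)\)\s*SELECT", "SELECT TOP $1");
                    break;
                case SqlDialect.MySql:
                    sql = sql.Replace("SQL::NOW()", "NOW()");
                    sql = Regex.Replace(sql, @"SQL::TOP\((\d+)\)\s*(SELECT.+)", "$2 LIMIT $1");
                    break;
            }
            return sql;
        }
    }

    // "SQL::TOP(10) SELECT * FROM Orders WHERE Created < SQL::NOW()" becomes
    //   SELECT TOP 10 * FROM Orders WHERE Created < GETDATE()   (SQL Server)
    //   SELECT * FROM Orders WHERE Created < NOW() LIMIT 10     (MySQL)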

This pays off in that we do not really have to care which engine the SQL we write is for.

+3

One thing to keep in mind is the realistic chance that you will ever switch databases. In most cases it will never happen, and even if it does, the switch will be a serious overhaul even if you used the database-neutral classes. Given that, it is probably best to use whatever has more functionality and helps you finish the project faster.

I believe that in most cases you should put a layer of your own above the actual .NET API, so that if you ever have to change which classes you use, it is not such a big problem. Even if you stay on the same database, you never know when you will have to change how you access it; anyone who moved from ASP to ASP.NET (ADODB vs. ADO.NET) can tell you how painful that is.

Therefore, I believe the best solution is to use the richer provider-specific API and build your own layer on top of it, so that you can swap the underlying implementation if you ever need to.
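As a rough illustration of that kind of layer (IOrderStore and SqlOrderStore are names made up for this sketch, and the Orders table is assumed): the application codes against the interface, while the implementation is free to use SqlClient and its extra features directly.

    using System.Collections.Generic;
    using System.Data.SqlClient;

    // Application-facing abstraction: no ADO.NET types leak out of it.
    interface IOrderStore
    {
        IList<string> GetOrderNumbers(int customerId);
    }

    // One concrete implementation, free to use provider-specific features.
    class SqlOrderStore : IOrderStore
    {
        private readonly string _connectionString;

        public SqlOrderStore(string connectionString)
        {
            _connectionString = connectionString;
        }

        public IList<string> GetOrderNumbers(int customerId)
        {
            var numbers = new List<string>();
            using (var connection = new SqlConnection(_connectionString))
            using (var command = new SqlCommand(
                "SELECT OrderNumber FROM Orders WHERE CustomerId = @customerId",
                connection))
            {
                command.Parameters.AddWithValue("@customerId", customerId);
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        numbers.Add(reader.GetString(0));
                    }
                }
            }
            return numbers;
        }
    }

Switching to Oracle or MySQL later would then mean writing another IOrderStore implementation rather than touching the calling code.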

+4

I was going to write something similar to what Lasevk said, but he beat me to it.

Also, at some point you have to talk to an actual database. You may be able to route most of your data access through interface-typed code, and that is a good idea, but in the end, if you are using SQL Server, you have to create a real SqlClient.SqlConnection instance somewhere. The same goes for Access, Oracle, or MySQL.

+2

I use the provider-specific classes where I am doing direct database work (unless I am told we may need to support another database; the number of times that has happened: once).

However, if I end up factoring non-database-specific code out into separate classes, I will often use the interfaces for method parameters, especially if I think the code will be useful in other projects and I am not relying on any features of the concrete access classes.
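For instance, a reusable helper can take the interface rather than a concrete connection type; ExecuteScalarInt is a made-up name for this sketch:

    using System;
    using System.Data;

    static class DbHelper
    {
        // Works against any ADO.NET provider because it only touches the interfaces.
        public static int ExecuteScalarInt(IDbConnection connection, string sql)
        {
            using (IDbCommand command = connection.CreateCommand())
            {
                command.CommandText = sql;
                return Convert.ToInt32(command.ExecuteScalar());
            }
        }
    }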

Basically, to paraphrase Einstein: make the solution as simple as possible, but no simpler.

+1

You should try to write database-agnostic code. It may not seem useful now, but you will probably be glad of it in the future.

+1

Take a look at the Microsoft Patterns & Practices Enterprise Library. Its data access code shows a good implementation of vendor independence.

http://msdn.microsoft.com/en-us/library/cc467894.aspx

0
