Scala: a map-like structure that does not require casting when extracting a value?

I am writing a data structure to wrap the results of a database query. The raw result is a Java ResultSet, and it will be converted into a map or class that gives access to the various fields of the result either by calling a named method or by passing a string to apply(). Obviously, different values can have different types. To reduce the burden on clients of this data structure, I would prefer that they not have to cast the values they extract; a value should come back already having the correct type.

For example, suppose I run a query that retrieves two column values, one of them an Int, the other a String, and the column names are "a" and "b", respectively. Some ideal syntax might be as follows:

 val javaResultSet = dbQuery("select a, b from table limit 1")

 // with ResultSet, particular values can be accessed like this:
 val a = javaResultSet.getInt("a")
 val b = javaResultSet.getString("b")

 // but this syntax is undesirable, since I want to convert this to a single
 // data structure; the preferred syntax might look something like this:
 val newStructure = toDataStructure[Int, String](javaResultSet)("a", "b")
 // that is, I'm willing to state the types during the instantiation
 // of such a data structure.

 // then,
 val a: Int = newStructure("a")
 // OR
 val a: Int = newStructure.a
 // in both cases, "val a" does not require asInstanceOf[Int].

I have tried to work out what data structure would allow this, but I cannot figure out how to avoid the casting.

Another requirement, obviously, is that I would like to define a single data structure that is used for all db queries. I realize I could easily define a case class or similar for each query, which would solve the typing problem, but such a solution does not scale well when many db queries are written. I suspect some people will suggest using some kind of ORM, but let us assume that in my case it is preferable to keep the query as a string.

Anyone have any suggestions? Thanks!

+4
6 answers

Joshua Bloch introduced a typesafe heterogeneous container that can be written in Java. I once adapted it a bit; it now works as a container of values and is basically a wrapper around two maps. Here is the code, and this is how you can use it. But this is just FYI, since you are interested in a Scala solution.

In Scala, I would start by playing with tuples. Tuples are heterogeneous collections. The elements can be accessed through fields such as _1, _2, _3, etc., but you don't want that; you want names. Here is how you can give them names:

 scala> val tuple = (1, "word")
 tuple: (Int, String) = (1,word)

 scala> val (a, b) = tuple
 a: Int = 1
 b: String = word

So, as mentioned earlier, I tried to build a ResultSetWrapper around tuples.
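
For the map-based route from the first paragraph, here is a minimal Scala sketch of a typed-key container (the names Key and TypedMap are made up for illustration, not taken from the linked Java code):

 final case class Key[T](name: String)

 class TypedMap private (underlying: Map[Key[_], Any]) {
   def updated[T](key: Key[T], value: T): TypedMap =
     new TypedMap(underlying.updated(key, value))
   // the only cast lives here, hidden inside the container
   def apply[T](key: Key[T]): T =
     underlying(key).asInstanceOf[T]
 }

 object TypedMap {
   val empty = new TypedMap(Map.empty)
 }

 // usage: the caller never casts
 val a = Key[Int]("a")
 val b = Key[String]("b")
 val row = TypedMap.empty.updated(a, 1).updated(b, "word")
 val x: Int = row(a)
 val y: String = row(b)

The catch, for the question's use case, is that the keys and their types still have to be declared somewhere for each query.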

0

To do this without casting, you need more information about the query, and you need that information at compile time.

I suspect some people will suggest using some kind of ORM, but let us assume that in my case it is preferable to keep the query as a string.

Your suspicion is correct, and you will not get around it. If none of the existing ORMs or DSLs, such as Squeryl, suit your taste, you can build your own. But I doubt that you will be able to keep the queries as plain strings.
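
As a rough sketch (not Squeryl, and not a full DSL; the Query name and runFirstRow helper are made up) of what "information at compile time" can look like: the query value itself carries the row type, so the caller never casts.

 import java.sql.ResultSet

 // a query bundles its SQL string with a typed row decoder
 final case class Query[A](sql: String, decode: ResultSet => A)

 val q: Query[(Int, String)] =
   Query("select a, b from table limit 1", rs => (rs.getInt("a"), rs.getString("b")))

 // whoever executes q.sql and feeds the ResultSet back in gets an (Int, String)
 def runFirstRow[A](q: Query[A], rs: ResultSet): Option[A] =
   if (rs.next()) Some(q.decode(rs)) else None

Note the gap this answer is pointing at: nothing checks that the SQL string and the decoder agree, and keeping those two in sync is exactly what an ORM or a typed DSL does for you.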

+2

The main problem is that you do not know how many columns any given query will return, and therefore you do not know how many type parameters the data structure should have; there is no way to abstract over the number of type parameters.

However, there is a data structure that exists in different versions for different numbers of type parameters: the tuple (Tuple2, Tuple3, and so on). You can define parameterized mapping functions for different numbers of parameters that return tuples, like this:

 import java.sql.ResultSet

 def toDataStructure2[T1, T2](rs: ResultSet)(c1: String, c2: String): (T1, T2) =
   (rs.getObject(c1).asInstanceOf[T1],
    rs.getObject(c2).asInstanceOf[T2])

 def toDataStructure3[T1, T2, T3](rs: ResultSet)(c1: String, c2: String, c3: String): (T1, T2, T3) =
   (rs.getObject(c1).asInstanceOf[T1],
    rs.getObject(c2).asInstanceOf[T2],
    rs.getObject(c3).asInstanceOf[T3])

You would need to define one of these for each number of columns you expect your queries to return (up to the tuple limit of 22).

This relies, of course, on the assumption that calling getObject and casting it to the given type is safe.

In your example, you can use the resulting tuple as follows:

 val (a, b) = toDataStructure2[Int, String](javaResultSet)("a", "b") 
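
As a variation on the same idea (not part of the original answer), the getObject-plus-asInstanceOf step can be made type-directed with a small type class, so the compiler picks the matching getter for each column type:

 import java.sql.ResultSet

 trait ColumnReader[T] {
   def read(rs: ResultSet, column: String): T
 }

 object ColumnReader {
   implicit val intReader: ColumnReader[Int] = new ColumnReader[Int] {
     def read(rs: ResultSet, column: String): Int = rs.getInt(column)
   }
   implicit val stringReader: ColumnReader[String] = new ColumnReader[String] {
     def read(rs: ResultSet, column: String): String = rs.getString(column)
   }
 }

 def toDataStructure2[T1: ColumnReader, T2: ColumnReader](rs: ResultSet)(c1: String, c2: String): (T1, T2) =
   (implicitly[ColumnReader[T1]].read(rs, c1),
    implicitly[ColumnReader[T2]].read(rs, c2))

The call site stays the same as above: val (a, b) = toDataStructure2[Int, String](javaResultSet)("a", "b").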
+1

If you decide to go down the path of heterogeneous collections, there are some very interesting posts on heterogeneously typed lists:

For example:

http://jnordenberg.blogspot.com/2008/08/hlist-in-scala.html

http://jnordenberg.blogspot.com/2008/09/hlist-in-scala-revisited-or-scala.html

with implementation at http://www.assembla.com/wiki/show/metascala

A second, larger series of posts begins with:

http://apocalisp.wordpress.com/2010/07/06/type-level-programming-in-scala-part-6a-heterogeneous-list%C2%A0basics/

and the series continues with parts b, c, and d, linked from part a.

Finally, there is a talk by Daniel Spiewak that touches on HOMaps:

http://vimeo.com/13518456

So all this is to say that you could perhaps build your solution from these ideas. I wish I had a concrete example, but I admit that I have not tried it myself yet!
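
To give a rough flavor of the idea, a tiny hand-rolled HList might look like this (illustrative only, and far simpler than the metascala and apocalisp implementations linked above):

 sealed trait HList
 sealed trait HNil extends HList
 case object HNil extends HNil
 final case class ::[+H, +T <: HList](head: H, tail: T) extends HList

 // every element keeps its own static type, so access needs no casting
 val row: Int :: String :: HNil = ::(1, ::("word", HNil))
 val i: Int = row.head
 val s: String = row.tail.head

The open question for the ResultSet use case is still the same one raised above: producing the right HList type for a query that is written as a plain string.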

+1

If you want to "extract a column value by name" into a plain bean instance, you can:

  • use reflection plus casting, which you (and I) do not like;
  • use the ResultSetToJavaBeanMapper provided by most ORM libraries, which is a bit heavyweight and coupled;
  • write a Scala compiler plugin, which is too complex to maintain.

So I think a lightweight ORM with the following features might satisfy you:

  • raw SQL support
  • a lightweight, declarative and adaptive ResultSetToJavaBeanMapper
  • nothing else.

I wrote a pilot project based on this idea. Note that it is still an ORM, but I think it may be useful to you or at least give you a hint.

Usage:

declare model:

 //declare DB schema
 trait UserDef extends TableDef {
   var name = property[String]("name", title = Some("姓名"))
   var age1 = property[Int]("age", primary = true)
 }

 //declare the model; it mixes in properties as {var name = ""}
 @BeanInfo
 class User extends Model with UserDef

 //declare an object.
 //it mixes in properties as {var name = Property[String]("name")}
 //and object User is a Mapper[User], thus it can translate a ResultSet to a User instance.
 object `package` {
   @BeanInfo implicit object User extends Table[User]("users") with UserDef
 }

then run raw SQL; the implicit Mapper[User] does the work for you:

 val users = SQL("select name, age from users").all[User]
 users.foreach { user => println(user.name) }

or even build a type-safe query like:

 val users = User.q.where(User.age > 20).where(User.name like "%liu%").all[User] 

see unit test:

https://github.com/liusong1111/soupy-orm/blob/master/src/test/scala/mapper/SoupyMapperSpec.scala

Home of the project:

https://github.com/liusong1111/soupy-orm

It uses "abstract type" and "implicitly" to do the magic, and you can check the source code of TableDef, Table, Model for details.

0

Several million years ago, I wrote an example showing how to use Scala's type system to push and pull values from a ResultSet. Check it out; it is close enough to what you want to do.

 implicit val conn = connect("jdbc:h2:f2", "sa", "");

 implicit val s: Statement = conn << setup;

 val insertPerson = conn prepareStatement "insert into person(type, name) values(?, ?)";
 for (val name <- names)
   insertPerson << rnd.nextInt(10) << name <<!;

 for (val person <- query("select * from person", rs => Person(rs, rs, rs)))
   println(person.toXML);

 for (val person <- "select * from person" <<! (rs => Person(rs, rs, rs)))
   println(person.toXML);

The expected types are used to guide the Scala compiler into selecting the correct functions on the ResultSet.
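
A self-contained illustration of that trick (not the original blog code; Col and the conversion names here are made up): implicit conversions let the expected parameter type select which ResultSet accessor gets called.

 import java.sql.ResultSet
 import scala.language.implicitConversions

 // pairs a ResultSet with the column to read; the target type picks the getter
 case class Col(rs: ResultSet, name: String)

 implicit def colToInt(c: Col): Int = c.rs.getInt(c.name)
 implicit def colToString(c: Col): String = c.rs.getString(c.name)

 case class Person(kind: Int, name: String)

 // the Int field triggers colToInt, the String field triggers colToString
 def personFrom(rs: ResultSet): Person =
   Person(Col(rs, "type"), Col(rs, "name"))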

0
