I have been experimenting with
PostgreSQL 9.2's native json datatype, and wanted to have
Play Framework 2.1-RC1 automatically convert the raw
PostgreSQL JDBC PGobject object into a
Play-specific JsValue object. Neither Play nor JDBC understands PostgreSQL's native
json datatype, so neither library provides an implicit conversion.
Automatic conversion can be performed by defining a parameter extractor. Without a parameter extractor, a compilation error will occur:
could not find implicit value for parameter extractor: anorm.Column[play.api.libs.json.JsValue]
In my case, I could have worked around this by extracting the value as a
String and parsing it. Note that because the
data field is marked
NOT NULL, I don't need to use an
Option[String] in the extractor.
get[Pk[Long]]("sample.id") ~
get[String]("sample.data") map {
  case id ~ data => Sample(id, Json.parse(data))
}
For
INSERTs and
UPDATEs, I'm using the
PostgreSQL CAST keyword, as I haven't experimented with converting the
JsValue object transparently into a JDBC object (presumably of type
PGobject) or with
defining a new varchar → json CAST on the database side.
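For the first alternative, a transparent conversion might look roughly like the following Anorm ToStatement instance. This is an untested sketch: it assumes Anorm 2.1's ToStatement trait has a set(PreparedStatement, Int, A) method, which I haven't verified against this exact release. The PGobject calls themselves are the standard PostgreSQL JDBC API.

```scala
import java.sql.PreparedStatement

import anorm.ToStatement
import org.postgresql.util.PGobject
import play.api.libs.json.{Json, JsValue}

// Untested sketch: wrap the JSON text in a PGobject tagged with the
// "json" type so the driver sends the correct type to the server.
implicit def jsValueToStatement: ToStatement[JsValue] = new ToStatement[JsValue] {
  def set(s: PreparedStatement, index: Int, value: JsValue): Unit = {
    val pgo = new PGobject()
    pgo.setType("json")
    pgo.setValue(Json.stringify(value))
    s.setObject(index, pgo)
  }
}
```

If this works, the INSERT below could bind the JsValue directly with `'data -> data` and drop the CAST entirely.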
Without the
CAST() the
INSERT fails because the JDBC driver doesn't set the proper type:
org.postgresql.util.PSQLException: ERROR: column "data" is of type json but expression is of type character varying
Hint: You will need to rewrite or cast the expression.
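The database-side alternative would be to register an implicit cast so the driver's character varying parameter is accepted for json columns. Something like the following is a commonly cited workaround (untested here); WITHOUT FUNCTION is possible because json shares varchar's on-disk text representation in PostgreSQL 9.2:

```sql
-- Untested: let character varying values be assigned to json columns
-- without an explicit CAST() in each statement. Requires superuser.
CREATE CAST (character varying AS json) WITHOUT FUNCTION AS IMPLICIT;
```

Note that an implicit cast applies database-wide, so it also silences this type check for every other statement, which may hide genuine mistakes.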
Here is the sample schema:
CREATE TABLE sample (
    id serial PRIMARY KEY,
    data json NOT NULL
);
Here's how a
Sample model looks:
package models

import anorm._
import anorm.SqlParser._
import play.api.db._
import play.api.Play.current
import play.api.libs.json.{Json, JsValue}

case class Sample(
  id: Pk[Long] = NotAssigned,
  data: JsValue
)

object Sample {

  // Teach Anorm how to read a PostgreSQL json column into a JsValue.
  implicit def rowToJsValue: Column[JsValue] = Column.nonNull { (value, meta) =>
    val MetaDataItem(qualified, nullable, clazz) = meta
    value match {
      case pgo: org.postgresql.util.PGobject => Right(Json.parse(pgo.getValue))
      case _ => Left(TypeDoesNotMatch("Cannot convert " + value + ":" +
        value.asInstanceOf[AnyRef].getClass + " to JsValue for column " + qualified))
    }
  }

  val extractor = {
    get[Pk[Long]]("sample.id") ~
    get[JsValue]("sample.data") map {
      case id ~ data => Sample(id, data)
    }
  }

  def list() = {
    DB.withConnection { implicit connection =>
      SQL(
        """
        SELECT
          *
        FROM
          sample
        """
      ).as(Sample.extractor *)
    }
  }

  def create(data: JsValue) = {
    DB.withConnection { implicit connection =>
      SQL(
        """
        INSERT INTO
          sample
        (
          data
        )
        VALUES (
          CAST({data} AS json)
        )
        RETURNING id
        """
      ).on(
        'data -> Json.stringify(data)
      ).as(scalar[Long].single)
    }
  }
}
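Calling the model is then straightforward. A hypothetical usage from a Play controller action (against a running database; the JSON payload here is purely illustrative):

```scala
import play.api.libs.json.Json
import models.Sample

// Insert a row and get the generated id back via RETURNING.
val id: Long = Sample.create(Json.obj("name" -> "widget", "qty" -> 3))

// Read every row back as Sample(id, JsValue).
val samples: List[Sample] = Sample.list()
```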
This approach is inspired by and adapted from
a StackOverflow question.