[JPA_SPEC-73] Parameterized AttributeConverter and/or AttributeConverter metadata access Created: 07/Feb/14 Updated: 23/Nov/15
It would be good if we were able to parameterize AttributeConverter instances and/or have access to property metadata within an AttributeConverter. That way we could reuse AttributeConverter code instead of building many similar converters, or even supply further required context information. An example:
Quite often we have to deal with ancient DB schemas, and just recently we started to migrate a large Cobol application to Java. We are unable to change the content of the database, and because of Cobol we have to support fixed-width string columns containing left-padded data. A 5-digit fixed-width column will represent 1 as '00001' and 100 as '00100'. Nothing we can change here. Unless I am mistaken, I have to create a dedicated converter for every required fixed length, ending up with a LeadingZeroTwoDigitAttributeConverter, a LeadingZeroFiveDigitAttributeConverter, and so on.
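To make the duplication concrete, here is a rough sketch of what that boilerplate looks like today. The class names are taken from the description above; the AttributeConverter interface is re-declared locally so the snippet compiles without a JPA dependency, and the null/empty-string handling is omitted.

```java
// Minimal local stand-in for javax.persistence.AttributeConverter,
// so this sketch is self-contained without a JPA jar on the classpath.
interface AttributeConverter<X, Y> {
    Y convertToDatabaseColumn(X attribute);
    X convertToEntityAttribute(Y dbData);
}

// The shared logic: pad on the way into the DB, parse on the way out.
abstract class LeadingZeroAttributeConverter implements AttributeConverter<Integer, String> {
    private final int width;

    protected LeadingZeroAttributeConverter(int width) {
        this.width = width;
    }

    @Override
    public String convertToDatabaseColumn(Integer attribute) {
        return String.format("%0" + width + "d", attribute);
    }

    @Override
    public Integer convertToEntityAttribute(String dbData) {
        return Integer.valueOf(dbData);
    }
}

// Without parameterization, every distinct column width needs its own
// no-arg subclass, because the provider instantiates converters reflectively.
class LeadingZeroTwoDigitAttributeConverter extends LeadingZeroAttributeConverter {
    LeadingZeroTwoDigitAttributeConverter() { super(2); }
}

class LeadingZeroFiveDigitAttributeConverter extends LeadingZeroAttributeConverter {
    LeadingZeroFiveDigitAttributeConverter() { super(5); }
}
```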
Right now it seems the JPA spec does not define whether there is one converter instance per persistent property or just a global one (which would be OK by current spec requirements). I would propose that parameterized converter instances are bound to property fields in order to support parameter evaluation during JPA provider startup.
I believe we need access to both basic attribute information and optionally supplied converter attributes. The following example assumes access to the leading pad char, the length attribute, and whether the attribute is nullable. (In my current project I have to store an empty string for null values in the database.)
@Convert(converter = LeadingDigitAttributeConverter.class, metaData="padChar=0" )
If you believe there is no need to support something like that: Unfortunately we are in the middle of migrating quite a few very very old applications to Java, and we can't change that stuff. And there is more to come.
|Comment by c.beikov [ 15/Apr/14 ]|
I agree that metadata is needed, but I guess it would be easier to just let the converter instance know about the metamodel Attribute instance or something similar.
|Comment by frenchc [ 15/Apr/14 ]|
Sounds good to me.
|Comment by tomdcc [ 30/May/14 ]|
We have need of this as well - we'd like to convert enum attributes to specific string representations in the database, and with the current spec we have to create a converter per enum, rather than e.g. having the enum classes implement an interface to make the required string available and use a single converter.
Hibernate has parameters that you can pass to its converter types, which is a workaround for this, but we're not using Hibernate for this project, and in any case it's pretty ugly to have to do that for every column.
Making the metamodel attribute available to the converter would be perfect, as it could then grab the attribute type. The nice thing about that approach, too, is that if someone wants to pass extra info in to the converter that isn't available in the normal JPA model, they can create a custom annotation and the converter can call attribute.getJavaMember() and look for annotations, and the info is right with all the other metadata for the attribute.
|Comment by isk0001y [ 23/Dec/14 ]|
Such a parameterized AttributeConverter may also be of help when one is creating converters for hundreds of enums.
Given a common interface, I can use a single AttributeConverter to persist any enum implementing it by just calling "getFoo()"; assuming getFoo() returns a basic type like String, the direction to the database is problem-free.
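A possible sketch of that pattern, with invented names (HasDbValue, Status) and the AttributeConverter interface re-declared locally to keep the snippet self-contained: writing is trivially generic, but reading back requires the target enum class, which is exactly the attribute metadata the converter cannot obtain from the spec today, so each enum still needs a tiny subclass.

```java
// Local stand-in for javax.persistence.AttributeConverter.
interface AttributeConverter<X, Y> {
    Y convertToDatabaseColumn(X attribute);
    X convertToEntityAttribute(Y dbData);
}

// The shared contract all convertible enums implement.
interface HasDbValue {
    String getFoo();
}

// Writing needs no type information; reading does. The enum class
// would come from attribute metadata if the spec exposed it.
abstract class DbValueEnumConverter<E extends Enum<E> & HasDbValue>
        implements AttributeConverter<E, String> {

    private final Class<E> enumType;

    protected DbValueEnumConverter(Class<E> enumType) {
        this.enumType = enumType;
    }

    @Override
    public String convertToDatabaseColumn(E attribute) {
        return attribute.getFoo();
    }

    @Override
    public E convertToEntityAttribute(String dbData) {
        for (E constant : enumType.getEnumConstants()) {
            if (constant.getFoo().equals(dbData)) {
                return constant;
            }
        }
        throw new IllegalArgumentException("Unknown DB value: " + dbData);
    }
}

// One boilerplate subclass per enum -- what metadata access would eliminate.
enum Status implements HasDbValue {
    OPEN("O"), CLOSED("C");
    private final String dbValue;
    Status(String dbValue) { this.dbValue = dbValue; }
    public String getFoo() { return dbValue; }
}

class StatusConverter extends DbValueEnumConverter<Status> {
    StatusConverter() { super(Status.class); }
}
```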
Both EclipseLink and Hibernate provide solutions for this. EclipseLink allows me to use its "Converter" infrastructure to create converters with a special "initialize" method that gives me access to the entity property being converted. Hibernate allows me to create user-defined types like "UserType", where the @Type annotation takes an array of @Parameters to configure the converter. Both techniques result in me creating only ONE converter, but any entity class is then dependent on the concrete JPA provider through imports. This cannot be what you guys want.
If I must stick to one JPA provider anyway, I can skip using JPA entirely and stay incompatible.
|Comment by uk.wildcat [ 28/Aug/15 ]|
For the slow, can someone provide an example of the workaround that c.beikov is outlining here?
|Comment by c.beikov [ 28/Aug/15 ]|
You just define your own annotation type, use it on your field or getter, and then, once you have access to the javax.persistence.metamodel.Attribute, you can get the member and access the MyAnnotation instance via reflection.
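A self-contained sketch of that workaround, assuming a hypothetical MyAnnotation with padChar and length attributes. In a real provider the Member would come from Attribute#getJavaMember(); plain java.lang.reflect stands in for it here.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;

// Step 1: define your own annotation (name and attributes are illustrative).
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.FIELD, ElementType.METHOD})
@interface MyAnnotation {
    char padChar() default '0';
    int length();
}

// Step 2: use it on the field (a plain class stands in for an entity here).
class Account {
    @MyAnnotation(length = 5)
    String accountNumber;
}

// Step 3: read the annotation back via reflection, the same lookup a
// converter could perform on the member from the metamodel Attribute.
class MetadataLookup {
    static MyAnnotation readConfig(Class<?> entity, String fieldName) {
        try {
            Field member = entity.getDeclaredField(fieldName);
            return member.getAnnotation(MyAnnotation.class);
        } catch (NoSuchFieldException e) {
            throw new IllegalStateException(e);
        }
    }
}
```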
|Comment by pbenedict [ 23/Nov/15 ]|
I think this is easily possible with some design enhancement. Food for thought:
1) Once converters are injectable (JPA_SPEC-109), that means they will have a controllable lifecycle. Converters that are parameterized obviously cannot be singletons because they require customization per instance.
2) Enhance @Converter to allow an array of parameterized key/value pairs. The key represents a setter method on the converter instance.
3) By a new rule of the spec, any converter that has "parameters" is a non-singleton.
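A rough sketch of how point 2 might work, with invented names (@ConverterParam, LeadingZeroConverter): at startup a provider could map each key to a JavaBeans-style setter on the converter instance and inject the configured value.

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Method;

// Hypothetical key/value pair an enhanced @Converter annotation could carry.
@Retention(RetentionPolicy.RUNTIME)
@interface ConverterParam {
    String key();   // maps to a setter, e.g. "padChar" -> setPadChar(String)
    String value();
}

// A converter exposing one public setter per supported parameter.
class LeadingZeroConverter {
    private String padChar = "0";
    public void setPadChar(String padChar) { this.padChar = padChar; }
    public String getPadChar() { return padChar; }
}

// What a provider might do at startup: resolve the key to a setter
// via JavaBeans naming and invoke it on the per-attribute instance.
class ConverterConfigurer {
    static void apply(Object converter, String key, String value) {
        String setterName = "set" + Character.toUpperCase(key.charAt(0)) + key.substring(1);
        try {
            Method setter = converter.getClass().getMethod(setterName, String.class);
            setter.invoke(converter, value);
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException("No setter for parameter: " + key, e);
        }
    }
}
```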