Parsing Datatypes

Steffenadensis
Posts: 31
Joined: Tue Jul 29, 2014 11:42 am

Parsing Datatypes

Post by Steffenadensis » Fri Dec 05, 2014 2:20 pm

Hello,

I just realised that if I use the MAP operator as shown below in Code_one, the database JDBC driver throws:

java.sql.BatchUpdateException: Batch entry 0 INSERT INTO head_master_sek VALUES('EBU_DE_ALD02_01_01_s','519','0.468') was aborted. Call 'getNextException' to see the cause.

If I change the MAP operator as in Code_two, it does work.

My question is: will this be automated at some point? Given the Tableschema in the DATABASESINK, the conversion should be possible automatically.

Code:

/// Code_one

#PARSER PQL
#RUNQUERY
accessp4000 = ACCESS({source='accessp4000',
	wrapper='GenericPush',
	transport='TcpServer',
	protocol='line',
	dataHandler='Tuple',
	options=[['port', '4000']],
	schema=[['line','String']]
})

container = ROUTE({
	predicates = [
		'startsWith(line,"EBU_DE_ALD02_01_01_s")', 
		'startsWith(line,"EBU_DE_ALD02_01_01_m")', 
		'startsWith(line,"EBU_DE_ALD02_02_01_s")', 
		'startsWith(line,"EBU_DE_ALD02_02_01_m")'
	]
}, accessp4000)

#LOOP i 0 UPTO 3
	container${i} = RENAME({isNoOp = 'true'}, ${i}:container)
#ENDLOOP
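
/// Split(line,"/") yields string values, so the attributes picked out of it below stay strings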

pre_master_s = MAP({
	expressions = [
		['Split(line,"/")','line']
	]
}, container0)

master_head_s = MAP({
	expressions = [
		['line[0]','HEAD_GEN_StreamID'],
		['line[1]','HEAD_CALC_TimeStamp'],
		['line[2]','HEAD_GEN_NumberVersion']
	]
}, pre_master_s)

head_master_sek = DATABASESINK({ 
	connection='db_head_master_sek',
	table='head_master_sek',
	drop='true',
	Tableschema=[
		'varchar(30)',
		'integer',
		'real'
	]
}, master_head_s)

Code:

/// Code_two
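/// Difference to Code_one: ToLong/ToDouble convert the string values
/// into the numeric types the database columns expect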

master_head_s = MAP({
	expressions = [
		['line[0]','HEAD_GEN_StreamID'],
		['ToLong(line[1])','HEAD_CALC_TimeStamp'],
		['ToDouble(line[2])','HEAD_GEN_NumberVersion']
	]
}, pre_master_s)
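
For reference, with Code_two the sink from Code_one stays unchanged; as far as I understand it, the converted attributes then line up with the Tableschema (String -> varchar(30), ToLong -> integer, ToDouble -> real):

Code:

/// Code_two continued: unchanged sink from Code_one
head_master_sek = DATABASESINK({
	connection='db_head_master_sek',
	table='head_master_sek',
	drop='true',
	Tableschema=['varchar(30)', 'integer', 'real']
}, master_head_s)
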
Kind regards

T. Steffen

Operating system: Debian GNU/Linux 7.6 (wheezy)

Marco Grawunder
Posts: 272
Joined: Tue Jul 29, 2014 10:29 am
Location: Oldenburg, Germany

Re: Parsing Datatypes

Post by Marco Grawunder » Fri Dec 05, 2014 3:18 pm

I think the second option is the better one, because the type check can be done before the data is sent to the database. In particular, there could be some further processing after the MAP operator...
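
For example, a downstream filter on the timestamp (just a sketch, the threshold value is made up) only works cleanly once the attribute is numeric:

Code:

/// relies on HEAD_CALC_TimeStamp being a Long, not a String
filtered_head_s = SELECT({predicate = 'HEAD_CALC_TimeStamp > 500'}, master_head_s)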

faarigia
Posts: 1
Joined: Mon Feb 09, 2015 8:52 am

Re: Parsing Datatypes

Post by faarigia » Mon Feb 09, 2015 9:25 am

STARTTIMESTAMP and ENDTIMESTAMP can only be used when there is a field that carries this information.
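
For example (just a sketch; source name, port and attribute names are made up): if the source already delivers an event time, that field can, as far as I know, be declared with the STARTTIMESTAMP datatype in the access schema so that it is used as the start of the time interval:

Code:

/// 'ts' is assumed to carry the event time sent by the source
timedsource = ACCESS({source='timedsource',
	wrapper='GenericPush',
	transport='TcpServer',
	protocol='CSV',
	dataHandler='Tuple',
	options=[['port', '4001']],
	schema=[
		['ts', 'STARTTIMESTAMP'],
		['value', 'Double']
	]
})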
