Before the conversion
After the conversion
You can use the conv function (for details, search for "conv spark"). Example usage:
conv(substr(data,3,64),16,10)
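A minimal sketch of where this breaks, runnable as-is in Spark SQL; the hex literal is the same sample value used in the workaround below (a 24-token amount at 18 decimals), and the 64-bit limit is inferred from the overflow behavior described next:

-- small values convert fine
select conv('ff', 16, 10);                 -- '255'
-- 0x14d1120d7b1600000 is about 2.4e19, above the unsigned 64-bit
-- maximum of 18446744073709551615, so the result silently overflows
select conv('14d1120d7b1600000', 16, 10);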
The cause is most likely an overflow: the value is too large for conv. A temporary workaround is to split the large hex value into several chunks, convert each chunk separately, and recombine them with the appropriate powers of 16:
with cte as (
    select
        lpad('0000000000000000000000000000000000000000000000014d1120d7b1600000', 64, '0') as n,
        cast('4294967296' as decimal(38, 0)) as exp0,        -- 16^8
        cast('72057594037927936' as decimal(38, 0)) as exp1  -- 16^14
)
-- split the 64 hex digits into 14 + 14 + 14 + 14 + 8 chunks, convert each
-- chunk (each fits comfortably in 64 bits), then recombine with powers of 16;
-- decimal(38, 0) keeps the arithmetic exact, whereas string operands would be
-- implicitly cast to double and lose precision above ~15 significant digits
select
      cast(conv(substring(n, 1, 14), 16, 10) as decimal(38, 0)) * exp0 * exp1 * exp1 * exp1
    + cast(conv(substring(n, 15, 14), 16, 10) as decimal(38, 0)) * exp0 * exp1 * exp1
    + cast(conv(substring(n, 29, 14), 16, 10) as decimal(38, 0)) * exp0 * exp1
    + cast(conv(substring(n, 43, 14), 16, 10) as decimal(38, 0)) * exp0
    + cast(conv(substring(n, 57, 14), 16, 10) as decimal(38, 0))
from cte;
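For this sample value the query returns 24000000000000000000 exactly (0x14d1120d7b1600000 is 24 * 10^18, i.e. 24 tokens at 18 decimals), which the single conv call above cannot represent.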
Alternatively, use the bytea2numeric() function. Note that this applies to Dune V1 only: bep20.xxx tables run on the V1 query engine (a PostgreSQL database), whereas the Dune V2 engine is Spark + Databricks, where the conv approach above applies.
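A minimal V1 sketch, assuming bytea2numeric accepts a bytea and returns a numeric (it is a Dune-defined helper, so treat the exact signature as an assumption); the literal is the same sample value, left-padded to an even number of hex digits so PostgreSQL's decode() accepts it:

-- decode() turns the hex string into bytea;
-- bytea2numeric() then reads the bytes as a big-endian unsigned integer
select bytea2numeric(decode('014d1120d7b1600000', 'hex'));
-- expected: 24000000000000000000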