Doris version 2.1.8
First I enabled the SQL-level memory limit via the following setting in be.conf:
#enable_query_memory_overcommit = false
Then I ran:
set global exec_mem_limit = 274877906944;
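For clarity, the two settings above amount to the following (a sketch based only on what is quoted in this post; note that be.conf changes require a BE restart, and that the `#` prefix in be.conf marks a line as commented out, so the line must be uncommented for the setting to apply):

```sql
-- In be.conf on each BE node (uncommented, then restart the BE):
-- enable_query_memory_overcommit = false

-- From a MySQL client connected to the FE:
set global exec_mem_limit = 274877906944;  -- 256 GB
```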
After restarting the BE, the FE, and the whole cluster, both the Flink and Spark load jobs fail with:
type:load, limit 2.00 GB, peak used 1.88 GB, current used 1.88 GB. backend 10.72.121.8,
Adding the following option through the Spark Doris connector also has no effect:
"doris.exec.mem.limit" = "687194767360"
Any ideas on how to fix this?