GCP CloudSql connection limit from a Dataflow job / Compute Engine

Problem description

I have a Dataflow job that connects to Cloud SQL and persists some data. On average I see about 75 active connections, with occasional spikes past 100, so I would like to know whether there is a maximum number of connections. The documentation does not seem to say. (https://cloud.google.com/sql/docs/mysql/connect-admin-ip)

Backstory and some context: one of my jobs started erroring out; it seems to just randomly lock up and stop persisting data:

Operation ongoing in step X for at least 305h20m00s without outputting or completing in state start
at sun.misc.Unsafe.park (Native Method)
at java.util.concurrent.locks.LockSupport.park (LockSupport.java:175)
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await (AbstractQueuedSynchronizer.java:2039)
at org.apache.commons.pool2.impl.LinkedBlockingDeque.takeFirst (LinkedBlockingDeque.java:590)
at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject (GenericObjectPool.java:425)
at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject (GenericObjectPool.java:346)
at org.apache.commons.dbcp2.PoolingDataSource.getConnection (PoolingDataSource.java:134)
at org.apache.commons.dbcp2.BasicDataSource.getConnection (BasicDataSource.java:809)
at org.apache.commons.dbcp2.DataSourceConnectionFactory.createConnection (DataSourceConnectionFactory.java:83)
at org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject (PoolableConnectionFactory.java:355)
at org.apache.commons.pool2.impl.GenericObjectPool.create (GenericObjectPool.java:874)
at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject (GenericObjectPool.java:417)
at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject (GenericObjectPool.java:346)
at org.apache.commons.dbcp2.PoolingDataSource.getConnection (PoolingDataSource.java:134)
at x.io.jobs.common.mysql.function.MySqlReadAllFn.setup (MySqlReadAllFn.java:57)
at x.io.jobs.tracer.function.ReadAggTraceStatusByIdFn$DoFnInvoker.invokeSetup (Unknown Source)
at org.apache.beam.runners.dataflow.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.deserializeCopy (DoFnInstanceManagers.java:83)
at org.apache.beam.runners.dataflow.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.get (DoFnInstanceManagers.java:75)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.reallyStartBundle (SimpleParDoFn.java:296)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement (SimpleParDoFn.java:326)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process (ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process (OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn$1.output (GroupAlsoByWindowsParDoFn.java:185)
at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner$1.outputWindowedValue (GroupAlsoByWindowFnRunner.java:108)
at org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.runners.core.ReduceFnRunner.lambda$onTrigger$1 (ReduceFnRunner.java:1060)
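
The trace shows the DoFn setup parked in GenericObjectPool.borrowObject, i.e. the worker is waiting for a free connection from the commons-dbcp2 pool that MySqlReadAllFn borrows from. The job's actual pool configuration is not shown above; the following is only a rough sketch of such a setup (class name, address, and values are assumed), included to illustrate the settings that control this blocking behaviour:

import org.apache.commons.dbcp2.BasicDataSource;

// Illustrative only; this is not the actual MySqlReadAllFn configuration.
public final class CloudSqlPoolFactory {

    public static BasicDataSource create() {
        BasicDataSource ds = new BasicDataSource();
        ds.setUrl("jdbc:mysql://<cloud-sql-ip>:3306/mydb"); // placeholder address
        ds.setUsername("app_user");                         // placeholder credentials
        ds.setPassword("secret");
        ds.setMaxTotal(8);             // upper bound on connections handed out by this pool
        ds.setMaxWaitMillis(30_000);   // default is -1 (wait forever), which matches the indefinite park in the trace
        ds.setValidationQuery("SELECT 1");
        ds.setTestOnBorrow(true);      // discard dead connections before handing them to the DoFn
        return ds;
    }

    private CloudSqlPoolFactory() {}
}

With the default unbounded wait, an exhausted pool looks exactly like the stuck step above, so bounding maxWaitMillis at least turns the hang into a visible error.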

Thanks.

Tags: google-cloud-platform, google-cloud-dataflow, apache-beam, google-cloud-sql

Solution


Cloud SQL does have a connection limit, and it can be changed by setting the max_connections flag on the instance. For more information on setting and viewing database flag values on an instance, see https://cloud.google.com/sql/docs/mysql/flags
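
The flag can be changed from the gcloud CLI, for example gcloud sql instances patch INSTANCE_NAME --database-flags=max_connections=500 (INSTANCE_NAME is a placeholder, and note that --database-flags replaces the instance's entire set of flags, so include any flags already configured). As a quick sanity check from the client side, the configured limit can be compared with the number of currently open connections; the snippet below is illustrative only, with placeholder connection details:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Illustrative check, not part of the original job: compare the instance's
// configured connection limit with the number of currently open threads.
public final class ConnectionLimitCheck {

    public static void main(String[] args) throws Exception {
        String url = "jdbc:mysql://<cloud-sql-ip>:3306/mydb"; // placeholder address
        try (Connection conn = DriverManager.getConnection(url, "app_user", "secret");
             Statement stmt = conn.createStatement()) {

            try (ResultSet rs = stmt.executeQuery("SHOW VARIABLES LIKE 'max_connections'")) {
                if (rs.next()) {
                    System.out.println("max_connections = " + rs.getString("Value"));
                }
            }
            try (ResultSet rs = stmt.executeQuery("SHOW GLOBAL STATUS LIKE 'Threads_connected'")) {
                if (rs.next()) {
                    System.out.println("Threads_connected = " + rs.getString("Value"));
                }
            }
        }
    }
}

If Threads_connected regularly approaches max_connections, either raise the flag or reduce the per-worker pool size and worker count so the job stays under the limit.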

