amazon-web-services - Lambda function fails to instantiate when using the AWS CDK
Problem description
I am trying to use the AWS CDK to deploy a Lambda function that will be triggered by S3 upload events. When I run cdk ls or cdk synth, I get this error:
Traceback (most recent call last):
  File "app.py", line 14, in <module>
    S3TosqsStack(app, "S3TosqsStack", env=core.Environment(account=os.getenv('CDK_DEFAULT_ACCOUNT'), region=os.getenv('CDK_DEFAULT_REGION')))
  File "/home/ec2-user/s3tosqs/.venv/lib64/python3.7/site-packages/jsii/_runtime.py", line 83, in __call__
    inst = super().__call__(*args, **kwargs)
  File "/home/ec2-user/s3tosqs/s3tosqs/s3tosqs_stack.py", line 37, in __init__
    bucket=s3.IBucket(bucket_name=lambda_deployment_bucket),
  File "/home/ec2-user/s3tosqs/.venv/lib64/python3.7/site-packages/typing_extensions.py", line 1548, in _no_init
    raise TypeError('Protocols cannot be instantiated')
TypeError: Protocols cannot be instantiated
Subprocess exited with error 1
It looks like the problem comes from the bucket definition of the Lambda function, but I don't understand what the error means, since I followed the documentation. I tried using bucket_arn instead of bucket_name, but that didn't work either.
Here is the main stack code:
s3tosqs_stack.py
from aws_cdk import (
    aws_s3 as s3,
    aws_s3_notifications as s3_notifications,
    aws_sqs as sqs,
    aws_lambda as _lambda
)
from aws_cdk import core

# User-specified Parameters
lambda_deployment_bucket = 'some-deployment-bucket'
trigger_bucket = 'some-trigger-bucket'
trigger_key = 'uploads'
queue_name = 'some-queue.fifo'
region = 'us-west-2'


class S3TosqsStack(core.Stack):

    def __init__(self, scope: core.Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Defines an SQS queue resource
        queue = sqs.Queue(
            self, 'NotificationQueue',
            queue_name=queue_name,
            content_based_deduplication=True,
            visibility_timeout=core.Duration.seconds(300)
        )

        # Defines an AWS Lambda resource
        lambda_function = _lambda.Function(
            self, 'S3toSQS',
            runtime=_lambda.Runtime.PYTHON_3_8,
            code=_lambda.Code.from_bucket(
                bucket=s3.IBucket(bucket_name=lambda_deployment_bucket),
                key='S3toSQS.zip'),
            handler='handler.publish_SQS_message',
            environment={'SOURCE_BUCKET': trigger_bucket,
                         'REGION': region,
                         'QUEUE_NAME': queue_name}
        )

        # Define S3 upload bucket for Lambda trigger
        upload_bucket = s3.Bucket(
            self, 'S3TriggerBucket',
            bucket_name=trigger_bucket
        )
        upload_bucket.add_event_notification(
            s3.EventType.OBJECT_CREATED,
            s3_notifications.LambdaDestination(lambda_function),
            s3.NotificationKeyFilter(
                prefix=trigger_key)
        )
Solution
Have you tried getting a bucket reference with Bucket.from_bucket_name instead of instantiating IBucket?
lambda_function = _lambda.Function(
    self, 'S3toSQS',
    runtime=_lambda.Runtime.PYTHON_3_8,
    code=_lambda.Code.from_bucket(
        bucket=s3.Bucket.from_bucket_name(self, "id", bucket_name=lambda_deployment_bucket),
        key='S3toSQS.zip'),
    handler='handler.publish_SQS_message',
    environment={'SOURCE_BUCKET': trigger_bucket,
                 'REGION': region,
                 'QUEUE_NAME': queue.queue_name}
)
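The error itself comes from Python's typing machinery rather than from the CDK: jsii exposes s3.IBucket as a typing Protocol (a structural interface), and a Protocol can be declared and implemented but never instantiated directly. A minimal stand-alone illustration of the same failure (IBucketLike is a stand-in for the jsii interface, not a real CDK class):

```python
from typing import Protocol


class IBucketLike(Protocol):
    """Stand-in for jsii's s3.IBucket: an interface, not a concrete class."""
    bucket_name: str


# Mirrors s3.IBucket(bucket_name=...) from the failing stack:
try:
    IBucketLike()
except TypeError as exc:
    print(exc)  # Protocols cannot be instantiated
```

Any concrete object that satisfies the interface, such as the reference returned by Bucket.from_bucket_name, can be passed wherever an IBucket is expected; you just cannot construct the interface itself, which is why swapping bucket_name for bucket_arn made no difference.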