Serverless deploys a second function for an S3 trigger

Problem description

I created an AWS Lambda function that runs whenever a new file is created under a specific path in S3, and it works well.

service: redshift

frameworkVersion: '2'

custom:
  bucket: extapp
  path_prefix: 'xyz'
  database: ABC
  schema: xyz_dbo
  table_prefix: shipmentlog
  user: admin
  password: "#$%^&*(*&^%$%"
  port: 5439
  endpoint: "*********.redshift.amazonaws.com"
  role: "arn:aws:iam::*****:role/RedshiftFileTransfer"

provider:
  name: aws
  runtime: python3.8
  stage: prod
  region: us-west-2
  stackName: redshift-prod-copy
  stackTags:
    Service: "it"
  lambdaHashingVersion: 20201221
  memorySize: 128
  timeout: 900
  logRetentionInDays: 14
  environment:
    S3_BUCKET: ${self:custom.bucket}
    S3_BUCKET_PATH_PREFIX: ${self:custom.path_prefix}
    REDSHIFT_DATABASE: ${self:custom.database}
    REDSHIFT_SCHEMA: ${self:custom.schema}
    REDSHIFT_TABEL_PREFIX: ${self:custom.table_prefix}
    REDSHIFT_USER: ${self:custom.user}
    REDSHIFT_PASSWORD: ${self:custom.password}
    REDSHIFT_PORT: ${self:custom.port}
    REDSHIFT_ENDPOINT: ${self:custom.endpoint}
    REDSHIFT_ROLE: ${self:custom.role}
  iam:
    role:
      name: s3-to-redshift-copy
      statements:
        - Effect: Allow
          Action:
            - s3:GetObject
          Resource: "arn:aws:s3:::${self:custom.bucket}/*"

functions:
  copy:
    handler: handler.run
    events:
      - s3:
          bucket: ${self:custom.bucket}
          event: s3:ObjectCreated:*
          rules:
            - prefix: ${self:custom.path_prefix}/
            - suffix: .json
          existing: true

package:
  exclude:
    - node_modules/**
    - package*.json
    - README.md

plugins:
  - serverless-python-requirements

However, when I deploy this function, a second function is deployed as well: a Node.js function named redshift-prod-custom-resource-existing-s3. I would like to understand why this second function is needed for my main Lambda function to be triggered when a new file is created under the specific path in the S3 bucket.

Tags: amazon-web-services, aws-lambda, serverless-framework, serverless, aws-serverless

Solution
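
The second function is created by the Serverless Framework itself because of the existing: true setting on the S3 event. The bucket extapp already exists and is not managed by this CloudFormation stack, so the framework cannot attach the event notification through a plain AWS::S3::Bucket resource. Instead it deploys a small Lambda-backed CloudFormation custom resource, the redshift-prod-custom-resource-existing-s3 Node.js function you are seeing, which runs during deployment and calls the S3 API to add the ObjectCreated notification (prefix xyz/, suffix .json) pointing at your copy function. It is only invoked at deploy time and plays no part in processing files afterwards.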



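If the helper function is unwanted, a minimal sketch of an alternative is shown below, assuming the bucket can instead be created and owned by this stack (i.e. it does not already exist in the account): dropping existing: true lets CloudFormation create the bucket and attach the notification directly, so no custom-resource Lambda is generated.

functions:
  copy:
    handler: handler.run
    events:
      - s3:
          bucket: ${self:custom.bucket}  # bucket is now created and managed by this stack (assumption)
          event: s3:ObjectCreated:*
          rules:
            - prefix: ${self:custom.path_prefix}/
            - suffix: .json
          # no "existing: true" here, so no custom-resource helper function is deployed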