How to invoke and wait for 1,000 AWS Lambdas run in parallel from Python?

Problem description

When I use the third-party aiobotocore library, it works up to NUM_WORKERS=500. If I try to go up to 1,000, I get this error:

    r, w, _ = self._select(self._readers, self._writers, [], timeout)
  File ".....\lib\selectors.py", line 314, in _select
    r, w, x = select.select(r, w, w, timeout)
ValueError: too many file descriptors in select()

Is there a way to execute 1,000 in parallel?

Source:


import os, sys, time, json
import asyncio
from itertools import chain
from typing import List
import logging
from functools import partial
from pprint import pprint 



# Third Party
import asyncpool
import aiobotocore.session
import aiobotocore.config

_NUM_WORKERS=500

async def execute_lambda(lambda_name: str, key: str, client):
    # Invoke the Lambda synchronously, passing the execution id in the payload
    response = await client.invoke(
        InvocationType='RequestResponse',
        FunctionName=lambda_name,
        LogType='Tail',
        Payload=json.dumps({'exec_id': key}),
    )

    # The response payload is a streaming body; collect its decoded chunks
    out = []
    async for event in response['Payload']:
        out.append(event.decode())

    return out


async def submit(lambda_name: str) -> List[dict]:
    """
    Returns the list of outputs from AWS Lambdas executed in parallel

    :param lambda_name: name of the Lambda function
    :return: list of Lambda return values
    """
    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger()

    session = aiobotocore.session.AioSession()
    config = aiobotocore.config.AioConfig(max_pool_connections=_NUM_WORKERS)
    contents = []
    #client = boto3.client('lambda', region_name='us-west-2')
    async with session.create_client('lambda', region_name='us-west-2', config=config) as client:
        worker_co = partial(execute_lambda, lambda_name)
        async with asyncpool.AsyncPool(None, _NUM_WORKERS, 'lambda_work_queue', logger, worker_co,
                                       return_futures=True, raise_on_join=True, log_every_n=10) as work_pool:
            for x in range(_NUM_WORKERS):
                contents.append(await work_pool.push(x, client))

    # retrieve results from futures
    contents = [c.result() for c in contents]
    return list(chain.from_iterable(contents))



def main(name, files):
    s = time.perf_counter()
    _loop = asyncio.get_event_loop()
    _result = _loop.run_until_complete(submit(name))
    pprint(_result)
    elapsed = time.perf_counter() - s
    print(f"{__file__} executed in {elapsed:0.2f} seconds.")

Lambda function:

import time
def lambda_handler(event, context):
    time.sleep(10)
    return {'code':0, 'exec_id':event['exec_id']}

Output:

 '{"code": 0, "exec_id": 0}',
 '{"code": 0, "exec_id": 1}',
 '{"code": 0, "exec_id": 2}',
 '{"code": 0, "exec_id": 3}',
...
 '{"code": 0, "exec_id": 496}',
 '{"code": 0, "exec_id": 497}',
 '{"code": 0, "exec_id": 498}',
 '{"code": 0, "exec_id": 499}']
my_cli_script.py executed in 14.56 seconds.

Tags: python-3.x, amazon-web-services, aws-lambda, boto3, python-asyncio

Solution


In response to the question raised in the comments here, this is the code I use to spin up 100 Lambda instances in parallel:


import boto3
import json
from concurrent.futures import ThreadPoolExecutor

# AWS credentials are exported in my env variables
# so region and account-id are fetched from there
lambda_ = boto3.client('lambda')

def invoke_lambda(payload):
    payload = {'body': json.dumps(payload)}

    response = lambda_.invoke(
        FunctionName='my-func',
        # I need to receive a response back from lambda
        # so I use sync invocation
        InvocationType='RequestResponse',
        LogType='Tail',
        Payload=json.dumps(payload)
    )

    res_payload = response.get('Payload').read()
    body = json.loads(res_payload).get('body')
    
    return body


MAX_WORKERS = 100  # how many lambdas you want to spin up concurrently

with ThreadPoolExecutor(max_workers=MAX_WORKERS) as executor:
    result = list(executor.map(invoke_lambda, data))
# data is a list of dicts, each element is a single "payload"
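
The `data` list is not defined in the answer. As a sketch of what this code expects, it could be built as below; the payload keys and the handler behind 'my-func' are assumptions for illustration, not part of the answer. Note that invoke_lambda() wraps the payload in a 'body' field and reads the 'body' field of the response back, so the handler follows an API-Gateway-style request/response shape:

import json

# Hypothetical payloads: one dict per invocation of 'my-func'
data = [{'exec_id': i} for i in range(100)]

# Assumed shape of the handler behind 'my-func' (not shown in the answer):
# it reads the JSON string from event['body'] and returns a dict whose
# 'body' field is what invoke_lambda() extracts on the client side.
def lambda_handler(event, context):
    payload = json.loads(event['body'])
    return {
        'statusCode': 200,
        'body': json.dumps({'code': 0, 'exec_id': payload['exec_id']}),
    }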

Two final notes:

  1. Spinning up 100 concurrent Lambdas within a dozen milliseconds is probably an exaggeration. For some reason, when I set a higher granularity on the CloudWatch metrics it doesn't plot anything, so I can't tell exactly how long it took. To stay on the safe side, I'd say within 2 seconds.
  2. So far this code has only been run from my local environment. It is pretty vanilla, so I don't see why it wouldn't work elsewhere (for example, from another, parent Lambda), but as a caveat, I haven't tested that.
