Google Cloud Dataflow and Cloud Functions error - ModuleNotFoundError

Problem description

I am triggering a Dataflow job from a Cloud Function in GCP.

Code embedded in the Cloud Function:

import apache_beam as beam
import argparse

PROJECT = 'projectName'
BUCKET='bucketName'
SCHEMA = 'sr:INTEGER,abv:FLOAT,id:INTEGER,name:STRING,style:STRING,ounces:FLOAT'
DATAFLOW_JOB_NAME = 'jobName'

def discard_incomplete(data):
    """Filters out records that are missing required field values."""
    return (len(data['abv']) > 0 and len(data['id']) > 0
            and len(data['name']) > 0 and len(data['style']) > 0)


def convert_types(data):
    """Converts string values to their appropriate type."""
    data['abv'] = float(data['abv']) if 'abv' in data else None
    data['id'] = int(data['id']) if 'id' in data else None
    data['name'] = str(data['name']) if 'name' in data else None
    data['style'] = str(data['style']) if 'style' in data else None
    data['ounces'] = float(data['ounces']) if 'ounces' in data else None
    return data

def del_unwanted_cols(data):
    """Delete the unwanted columns"""
    del data['ibu']
    del data['brewery_id']
    return data

def execute(event, context):
    argv = [
      '--project={0}'.format(PROJECT),
      '--job_name={0}'.format(DATAFLOW_JOB_NAME),
      '--staging_location=gs://{0}/staging/'.format(BUCKET),
      '--temp_location=gs://{0}/staging/'.format(BUCKET),
      '--region=us-central1',
      '--runner=DataflowRunner'
    ]

    p = beam.Pipeline(argv=argv)
    input_file = 'gs://{0}/beers.csv'.format(BUCKET)

    (p | 'ReadData' >> beam.io.ReadFromText(input_file, skip_header_lines=1)
       | 'SplitData' >> beam.Map(lambda x: x.split(','))
       | 'FormatToDict' >> beam.Map(lambda x: {"sr": x[0], "abv": x[1], "ibu": x[2], "id": x[3], "name": x[4], "style": x[5], "brewery_id": x[6], "ounces": x[7]}) 
       | 'DeleteIncompleteData' >> beam.Filter(discard_incomplete)
       | 'ChangeDataType' >> beam.Map(convert_types)
       | 'DeleteUnwantedData' >> beam.Map(del_unwanted_cols)
       | 'WriteToBigQuery' >> beam.io.WriteToBigQuery(
           '{0}:sandeep_beer_test.beer_data'.format(PROJECT),
           schema=SCHEMA,
           write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
    p.run()
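As a quick sanity check, the per-record helpers can be exercised on one parsed CSV line without running Beam at all. The sketch below copies the three functions from the pipeline above and applies them in the same order the pipeline does; the sample field values are made up:

```python
# Standalone check of the per-record helpers (copied from the pipeline above).
def discard_incomplete(data):
    """Keep only records whose required fields are non-empty strings."""
    return (len(data['abv']) > 0 and len(data['id']) > 0
            and len(data['name']) > 0 and len(data['style']) > 0)

def convert_types(data):
    """Convert string values to their appropriate types."""
    data['abv'] = float(data['abv']) if 'abv' in data else None
    data['id'] = int(data['id']) if 'id' in data else None
    data['name'] = str(data['name']) if 'name' in data else None
    data['style'] = str(data['style']) if 'style' in data else None
    data['ounces'] = float(data['ounces']) if 'ounces' in data else None
    return data

def del_unwanted_cols(data):
    """Drop the columns the pipeline does not load into BigQuery."""
    del data['ibu']
    del data['brewery_id']
    return data

# One made-up CSV line in the expected column order.
line = '1,0.05,25,1436,Pub Beer,American Pale Lager,408,12.0'
fields = line.split(',')
row = {"sr": fields[0], "abv": fields[1], "ibu": fields[2], "id": fields[3],
       "name": fields[4], "style": fields[5], "brewery_id": fields[6],
       "ounces": fields[7]}

assert discard_incomplete(row)  # all required fields are non-empty
row = del_unwanted_cols(convert_types(row))
print(row)
```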
    

The Dataflow job does get triggered when the Cloud Function runs, but the job keeps failing. When I check the job logs, I see this error message: ModuleNotFoundError: No module named 'google.cloud.functions'

requirements.txt

apache-beam[gcp]

The Python code embedded in the Cloud Function works fine if I run it directly from Cloud Shell after installing apache-beam[gcp].

Please share your thoughts on how to get past this missing-module error in Dataflow.

Thanks,

Sandeep

Tags: google-cloud-platform, google-cloud-functions, apache-beam

Solution


This is most likely happening because the main session is being pickled via --save_main_session. With save_main_session enabled, Beam pickles the state of the main module and ships it to the Dataflow workers so they can resolve global names; inside the Cloud Functions runtime that session includes the google.cloud.functions module, which is not installed on the workers, so unpickling fails with the ModuleNotFoundError you are seeing. Make sure --save_main_session (or save_main_session=True in PipelineOptions) is not set anywhere, and keep the helper functions the pipeline references defined at module level, as your code already does, so they can be pickled independently of the main session.
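As a sketch of the fix under that assumption, the Dataflow arguments can be built without the --save_main_session flag. The project, bucket, and job names below are the placeholders from the question, and build_dataflow_argv is a hypothetical helper, not part of any library:

```python
# Hypothetical helper: build Dataflow pipeline arguments without
# --save_main_session, so the Cloud Functions runtime's main session
# (which has google.cloud.functions imported) is never pickled and
# shipped to workers that don't have that module installed.
def build_dataflow_argv(project, bucket, job_name, region='us-central1'):
    argv = [
        '--project={0}'.format(project),
        '--job_name={0}'.format(job_name),
        '--staging_location=gs://{0}/staging/'.format(bucket),
        '--temp_location=gs://{0}/temp/'.format(bucket),
        '--region={0}'.format(region),
        '--runner=DataflowRunner',
        # Deliberately no '--save_main_session' here.
    ]
    assert '--save_main_session' not in argv
    return argv

argv = build_dataflow_argv('projectName', 'bucketName', 'jobName')
print(argv)
```

This argv list can be passed to beam.Pipeline(argv=argv) exactly as in the question; since the per-record functions live at module level, the pipeline does not need the main session to be saved.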

