Azure Log Analytics Data Collector REST API connection timeout error from Databricks

Problem description

I am trying to send some custom logs from a Databricks notebook to Log Analytics by following the Microsoft tutorial, but I am getting a REST API connection timeout error.

ConnectionError: HTTPSConnectionPool(host='XXXXXXX-XXXX-XXXX-XXXX-XXXXXXXX.ods.opinsights.azure.com', port=443): Max retries exceeded with url: /api/logs?api-version=2016-04-01 (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7fbed9108310>: Failed to establish a new connection: [Errno 110] Connection timed out'))

Any suggestions? How can I allow Azure Databricks to access the Log Analytics workspace?

Tags: azure, azure-databricks, azure-log-analytics

Solution


Below is a sample Python script showing how to submit custom log data with the Azure Monitor HTTP Data Collector API.
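Since the original error is a connection timeout rather than an authentication failure, it is worth first confirming that the cluster can reach the workspace ingestion endpoint at all. A minimal sketch (the function name `can_reach_workspace` is illustrative, not part of the tutorial):

```python
import socket

def can_reach_workspace(workspace_id, timeout=5):
    """Return True if the Log Analytics ingestion endpoint accepts a
    TCP connection on port 443 from this cluster, False otherwise."""
    host = "{}.ods.opinsights.azure.com".format(workspace_id)
    try:
        # A plain TCP connect is enough to distinguish a network/firewall
        # block (timeout) from an application-level error (HTTP status).
        with socket.create_connection((host, 443), timeout=timeout):
            return True
    except OSError:
        return False
```

If this returns False, the problem is network-level (for example, an NSG rule or firewall between the Databricks cluster and the endpoint), and no change to the script below will fix it.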

import json
import requests
import datetime
import hashlib
import hmac
import base64

#Retrieve your Log Analytics Workspace ID from your Key Vault Databricks Secret Scope
wks_id = dbutils.secrets.get(scope = "keyvault_scope", key = "wks-id-logaw1")

#Retrieve your Log Analytics Primary Key from your Key Vault Databricks Secret Scope
wks_shared_key = dbutils.secrets.get(scope = "keyvault_scope", key = "wks-shared-key-logaw1")

#The log type is the name of the event that is being submitted
log_type = 'WebMonitorTest'

#An example JSON web monitor object
json_data = [{
  "slot_ID": 12345,
  "ID": "5cdad72f-c848-4df0-8aaa-ffe033e75d57",
  "availability_Value": 100,
  "performance_Value": 6.954,
  "measurement_Name": "last_one_hour",
  "duration": 3600,
  "warning_Threshold": 0,
  "critical_Threshold": 0,
  "IsActive": "true"
},
{
  "slot_ID": 67890,
  "ID": "b6bee458-fb65-492e-996d-61c4d7fbb942",
  "availability_Value": 100,
  "performance_Value": 3.379,
  "measurement_Name": "last_one_hour",
  "duration": 3600,
  "warning_Threshold": 0,
  "critical_Threshold": 0,
  "IsActive": "false"
}]
body = json.dumps(json_data)

#####################
######Functions######
#####################

#Build the API signature
def build_signature(customer_id, shared_key, date, content_length, method, content_type, resource):
  x_headers = 'x-ms-date:' + date
  string_to_hash = method + "\n" + str(content_length) + "\n" + content_type + "\n" + x_headers + "\n" + resource
  bytes_to_hash = str.encode(string_to_hash,'utf-8')  
  decoded_key = base64.b64decode(shared_key)
  encoded_hash = (base64.b64encode(hmac.new(decoded_key, bytes_to_hash, digestmod=hashlib.sha256).digest())).decode()
  authorization = "SharedKey {}:{}".format(customer_id,encoded_hash)
  return authorization

#Build and send a request to the POST API
def post_data(customer_id, shared_key, body, log_type):
  method = 'POST'
  content_type = 'application/json'
  resource = '/api/logs'
  rfc1123date = datetime.datetime.utcnow().strftime('%a, %d %b %Y %H:%M:%S GMT')
  content_length = len(body)
  signature = build_signature(customer_id, shared_key, rfc1123date, content_length, method, content_type, resource)
  uri = 'https://' + customer_id + '.ods.opinsights.azure.com' + resource + '?api-version=2016-04-01'

  headers = {
      'content-type': content_type,
      'Authorization': signature,
      'Log-Type': log_type,
      'x-ms-date': rfc1123date
  }

  response = requests.post(uri, data=body, headers=headers)
  if 200 <= response.status_code <= 299:
      print('Accepted')
  else:
      print("Response code: {}".format(response.status_code))
      
#Post the log
post_data(wks_id, wks_shared_key, body, log_type)
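If the endpoint is reachable but connections are still flaky, the POST can be made more resilient with an explicit timeout and automatic retries. A sketch using a `requests` session with an `urllib3` retry policy (assuming urllib3 >= 1.26, which names the parameter `allowed_methods`); the helper name `make_retrying_session` is illustrative:

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

def make_retrying_session(retries=3, backoff=1.0):
    """Build a requests.Session that retries transient failures
    (connection errors and throttling/server-error statuses) with
    exponential backoff."""
    retry = Retry(
        total=retries,
        backoff_factor=backoff,
        status_forcelist=[429, 500, 502, 503, 504],
        allowed_methods=["POST"],  # the collector endpoint only takes POST
    )
    adapter = HTTPAdapter(max_retries=retry)
    session = requests.Session()
    session.mount("https://", adapter)
    return session
```

Inside `post_data`, you could then call `session.post(uri, data=body, headers=headers, timeout=10)` instead of `requests.post(...)`, so a hung connection fails fast and is retried rather than blocking until the OS-level timeout.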

This approach lets you write custom logs to Log Analytics from Databricks on Azure.

