google-cloud-platform - Dataflow job's temp_location is "gs://dataflow-staging-us-east1-657091687220/temp" instead of the one I set
Problem description
I created a template like this:
python main.py --project PROJECT_NAME --staging_location gs://BUCKETNAME/dataflow/staging --temp_location gs://BUCKETNAME/temp --template_location gs://BUCKETNAME/dataflow/templates/JOBNAME --machine_type n1-custom-2-13312 --runner DataflowRunner --max_num_workers 36 --no_use_public_ips --job_name JOBNAME --region us-east1
Then I launch it like this:
gcloud dataflow jobs run DFLOWJOBNAME --gcs-location gs://BUCKETNAME/dataflow/templates/qtest --parameters INPUT_AND_OUTPUTFILESPARAMS --region us-east1
The job runs fine until the end, where it fails to read some Avro files from this mysterious gs://dataflow-staging-us-east1-657091687220/temp location.
At the end of this question is the output of gcloud dataflow jobs show JOB_ID --environment.
I can see 2 fields in it that contain the unknown GCS bucket, namely "tempStoragePrefix" and "gcpTempLocation".
I don't really care about the tmp folder; I just want the job to finish.
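A quick way to confirm which fields carry the unexpected bucket is to scan the `--environment` dump for the service-owned `dataflow-staging-*` name; a minimal stdlib sketch over an abbreviated copy of the output below (the three lines in `env_dump` are taken from the dump; everything else is illustrative):

```python
# Abbreviated excerpt of `gcloud dataflow jobs show JOB_ID --environment`
env_dump = """\
temp_location: gs://TMPBUCKET/temp/beamapp-lorenzo-0128125622-520176.1580216182.520259
tempStoragePrefix: storage.googleapis.com/dataflow-staging-us-east1-657091687220/temp
gcpTempLocation: gs://dataflow-staging-us-east1-657091687220/temp
"""

# Collect the name of every field whose value points at the
# service-owned dataflow-staging bucket rather than the user's bucket.
suspect = [line.split(":", 1)[0] for line in env_dump.splitlines()
           if "dataflow-staging" in line]
print(suspect)  # ['tempStoragePrefix', 'gcpTempLocation']
```

Note that the user-supplied `temp_location` survives in the dump; only the two Dataflow-service fields point at the default bucket.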
creationTime: '2020-01-28 13:00:19'
environment:
clusterManagerApiService: instancegroup.googleapis.com
dataset: bigquery.googleapis.com/cloud_dataflow
experiments:
- emit_autoscaling_rationales
- emit_autoscaling_monitoring_events
- enable_billing_v_1_5
- use_cloud_gaia
- enable_dataflow_service_account
- enable_dataprep_new_billing
- delayed_launch
- enable_component_new_persistence_format_wave4
- enable_throttled_based_rescaling
- use_grpc_shuffle_appliance_transport
- use_replica_pools
- limit_preemptible_worker_pct
- limit_resizing_by_cpu_util
- override_controller_service_account
- use_shuffle_service_dynamic_repartitioning
- shuffle_service_repartition_hotkey_detection_fraction=0.8
- enable_shuffle_service_new_billing
- enable_shuffle_service_throttled_proxy_based_scaling
- use_dataflow_service_account_in_igm
- use_fixed_costs_in_resizing
- use_gci_image
- use_host_networking
- use_multi_hop_delegation
- use_new_tmp_filename_format
- use_process_pool_config
- use_work_manager_v2
- use_worker_zone_chooser_by_default
- worker-translation
- use_fastavro
sdkPipelineOptions:
display_data:
- key: runner
namespace: apache_beam.options.pipeline_options.PipelineOptions
type: STRING
value: DataflowRunner
- key: project
namespace: apache_beam.options.pipeline_options.PipelineOptions
type: STRING
value: PROJECT_NAME
- key: job_name
namespace: apache_beam.options.pipeline_options.PipelineOptions
type: STRING
value: beamapp-lorenzo-0128125622-520176
- key: staging_location
namespace: apache_beam.options.pipeline_options.PipelineOptions
type: STRING
value: gs://MYBUCKET/dataflow/staging/beamapp-lorenzo-0128125622-520176.1580216182.520259
- key: temp_location
namespace: apache_beam.options.pipeline_options.PipelineOptions
type: STRING
value: gs://TMPBUCKET/temp/beamapp-lorenzo-0128125622-520176.1580216182.520259
- key: region
namespace: apache_beam.options.pipeline_options.PipelineOptions
type: STRING
value: us-east1
- key: template_location
namespace: apache_beam.options.pipeline_options.PipelineOptions
type: STRING
value: gs://MYBUCKET/dataflow/templates/JOBNAME
- key: max_num_workers
namespace: apache_beam.options.pipeline_options.PipelineOptions
type: INTEGER
value: 36
- key: machine_type
namespace: apache_beam.options.pipeline_options.PipelineOptions
type: STRING
value: n1-custom-2-13312
- key: use_public_ips
namespace: apache_beam.options.pipeline_options.PipelineOptions
type: BOOLEAN
value: false
- key: experiments
namespace: apache_beam.options.pipeline_options.PipelineOptions
type: STRING
value: "['use_fastavro']"
- key: beam_plugins
namespace: apache_beam.options.pipeline_options.PipelineOptions
type: STRING
value: "['apache_beam.io.filesystem.FileSystem', 'apache_beam.io.hadoopfilesystem.HadoopFileSystem',\
\ 'apache_beam.io.localfilesystem.LocalFileSystem', 'apache_beam.io.gcp.gcsfilesystem.GCSFileSystem']"
- key: save_main_session
namespace: apache_beam.options.pipeline_options.PipelineOptions
type: BOOLEAN
value: true
- key: input
namespace: apache_beam.options.pipeline_options.PipelineOptions
type: STRING
value: 'RuntimeValueProvider(option: input, type: str, default_value: None)'
- key: output
namespace: apache_beam.options.pipeline_options.PipelineOptions
type: STRING
value: 'RuntimeValueProvider(option: output, type: str, default_value: None)'
- key: data4
namespace: apache_beam.options.pipeline_options.PipelineOptions
type: STRING
value: 'RuntimeValueProvider(option: data4, type: str, default_value: None)'
- key: data3
namespace: apache_beam.options.pipeline_options.PipelineOptions
type: STRING
value: 'RuntimeValueProvider(option: data3, type: str, default_value:
None)'
- key: data2
namespace: apache_beam.options.pipeline_options.PipelineOptions
type: STRING
value: 'RuntimeValueProvider(option: data2, type: str, default_value: None)'
- key: data1
namespace: apache_beam.options.pipeline_options.PipelineOptions
type: STRING
value: 'RuntimeValueProvider(option: data1, type: str, default_value:
None)'
- key: templateLocation
namespace: google.dataflow.v1beta3.TemplatesService
type: STRING
value: gs://MYBUCKET/dataflow/templates/JOBNAME
options:
data3: null
data2: null
beam_plugins:
- apache_beam.io.filesystem.FileSystem
- apache_beam.io.hadoopfilesystem.HadoopFileSystem
- apache_beam.io.localfilesystem.LocalFileSystem
- apache_beam.io.gcp.gcsfilesystem.GCSFileSystem
dataflow_endpoint: https://dataflow.googleapis.com
direct_num_workers: 1
direct_runner_bundle_repeat: 0
direct_runner_use_stacked_bundle: true
dry_run: false
enable_streaming_engine: false
environment_cache_millis: 0
experiments:
- use_fastavro
input: null
job_name: beamapp-lorenzo-0128125622-520176
machine_type: n1-custom-2-13312
data1: null
max_num_workers: 36
no_auth: false
output: null
pipelineUrl: gs://MYBUCKET/dataflow/staging/beamapp-lorenzo-0128125622-520176.1580216182.520259/pipeline.pb
pipeline_type_check: true
profile_cpu: false
profile_memory: false
profile_sample_rate: 1
project: PROJECT_NAME
data4: null
region: us-east1
runner: DataflowRunner
runtime_type_check: false
save_main_session: true
sdk_location: default
sdk_worker_parallelism: 0
staging_location: gs://MYBUCKET/dataflow/staging/beamapp-lorenzo-0128125622-520176.1580216182.520259
streaming: false
temp_location: gs://TMPBUCKET/temp/beamapp-lorenzo-0128125622-520176.1580216182.520259
templateLocation: gs://MYBUCKET/dataflow/templates/JOBNAME
template_location: gs://MYBUCKET/dataflow/templates/JOBNAME
type_check_strictness: DEFAULT_TO_ANY
update: false
use_public_ips: false
tempStoragePrefix: storage.googleapis.com/dataflow-staging-us-east1-657091687220/temp
userAgent:
name: Apache Beam Python 3.6 SDK
version: 2.16.0
version:
job_type: PYTHON_BATCH
major: '7'
workerPools:
- autoscalingSettings:
algorithm: AUTOSCALING_ALGORITHM_BASIC
maxNumWorkers: 36
diskSizeGb: 250
diskSourceImage: https://www.googleapis.com/compute/v1/projects/dataflow-service-producer-prod/global/images/dataflow-dataflow-owned-resource-20200109-14-rc00
diskType: compute.googleapis.com/projects//zones//disks/pd-standard
ipConfiguration: WORKER_IP_PRIVATE
kind: harness
machineType: n1-custom-2-13312
metadata:
cloud_region: us-east1
cos-update-strategy: update_disabled
dataflow_api_endpoint: https://dataflow.googleapis.com/
enable_jvm_metrics: 'false'
google-container-manifest: |
{
"apiVersion": "v1",
"kind": "Pod",
"metadata": {
"name": "dataflow"
},
"spec": {
"containers": [ {
"args": [ "--log_file=/var/log/dataflow/boot-json.log", "--log_dir=/var/log/dataflow", "--work_dir=/var/opt/google/dataflow", "--tmp_dir=/var/opt/google/tmp", "--endpoint=https://dataflow.googleapis.com/" ],
"image": "gcr.io/cloud-dataflow/v1beta3/python36:2.16.0",
"imagePullPolicy": "IfNotPresent",
"name": "python",
"volumeMounts": [ {
"mountPath": "/opt/google/dataflow/libshuffle_v1.so",
"name": "shuffle-lib-v1-python"
}, {
"mountPath": "/opt/google/dataflow/libshuffle_v1_py3.so",
"name": "shuffle-lib-v1-python3"
}, {
"mountPath": "/var/opt/google",
"name": "persist"
}, {
"mountPath": "/var/log/dataflow",
"name": "dataflowlogs-harness"
} ]
}, {
"args": [ "--log_file=/var/log/dataflow/boot-json.log", "--log_dir=/var/log/dataflow", "--physmem_limit_pct=30", "--sorter_size=134217728", "--port=12345", "--grpc_port=12346", "--status_port=22349", "--db_path=/var/shuffle" ],
"image": "dataflow.gcr.io/v1beta3/shuffle:20200109-14-rc00",
"imagePullPolicy": "Never",
"name": "shuffle",
"ports": [ {
"containerPort": 12345,
"hostPort": 12345,
"name": "sh-data-port"
}, {
"containerPort": 12346,
"hostPort": 12346,
"name": "sh-grpc-port"
}, {
"containerPort": 22349,
"hostPort": 22349,
"name": "sh-status-port"
} ],
"volumeMounts": [ {
"mountPath": "/var/shuffle",
"name": "dataflow-shuffle"
}, {
"mountPath": "/var/log/dataflow",
"name": "dataflowlogs-shuffle"
} ]
}, {
"args": [ "--teardown_interval=10m", "--log_file=/var/log/dataflow/vm_monitor-json.log", "--dataflow_base_path=https://dataflow.googleapis.com/", "--region=us-east1", "--teardown_policy=TEARDOWN_ALWAYS" ],
"image": "dataflow.gcr.io/v1beta3/vmmonitor:20200109-14-rc00",
"imagePullPolicy": "Never",
"name": "vmmonitor",
"volumeMounts": [ {
"mountPath": "/var/log/dataflow",
"name": "dataflowlogs-vmmonitor"
} ]
}, {
"args": [ "--log_file=/var/log/dataflow/health_checker-json.log", "--dataflow_base_path=https://dataflow.googleapis.com/", "--region=us-east1" ],
"image": "dataflow.gcr.io/v1beta3/healthchecker:20200109-14-rc00",
"imagePullPolicy": "Never",
"name": "healthchecker",
"volumeMounts": [ {
"mountPath": "/var/log/dataflow",
"name": "dataflowlogs-healthchecker"
} ]
} ],
"hostNetwork": true,
"volumes": [ {
"hostPath": {
"path": "/var/lib/agent/libshuffle_v1.so"
},
"name": "shuffle-lib-v1-python"
}, {
"hostPath": {
"path": "/var/lib/agent/libshuffle_v1_py3.so"
},
"name": "shuffle-lib-v1-python3"
}, {
"hostPath": {
"path": "/var/opt/google/dataflow"
},
"name": "persist"
}, {
"hostPath": {
"path": "/var/log/dataflow/taskrunner/harness"
},
"name": "dataflowlogs-harness"
}, {
"hostPath": {
"path": "/var/opt/dataflow/shuffle"
},
"name": "dataflow-shuffle"
}, {
"hostPath": {
"path": "/var/log/dataflow/shuffle"
},
"name": "dataflowlogs-shuffle"
}, {
"hostPath": {
"path": "/var/log/dataflow/vm_monitor"
},
"name": "dataflowlogs-vmmonitor"
}, {
"hostPath": {
"path": "/var/log/dataflow/health_checker"
},
"name": "dataflowlogs-healthchecker"
} ]
}
}
job_name: JOB_NAME
packages: gs://MYBUCKET/dataflow/staging/beamapp-lorenzo-0128125622-520176.1580216182.520259/pickled_main_session|pickled_main_session|gs://MYBUCKET/dataflow/staging/beamapp-lorenzo-0128125622-520176.1580216182.520259/dataflow_python_sdk.tar|dataflow_python_sdk.tar|gs://MYBUCKET/dataflow/staging/beamapp-lorenzo-0128125622-520176.1580216182.520259/apache_beam-2.16.0-cp36-cp36m-manylinux1_x86_64.whl|apache_beam-2.16.0-cp36-cp36m-manylinux1_x86_64.whl
sdk_pipeline_options: "{\"display_data\":[{\"key\":\"runner\",\"namespace\"\
:\"apache_beam.options.pipeline_options.PipelineOptions\",\"type\":\"STRING\"\
,\"value\":\"DataflowRunner\"},{\"key\":\"project\",\"namespace\":\"apache_beam.options.pipeline_options.PipelineOptions\"\
,\"type\":\"STRING\",\"value\":\"PROJECT_NAME\"},{\"key\":\"job_name\",\"namespace\"\
:\"apache_beam.options.pipeline_options.PipelineOptions\",\"type\":\"STRING\"\
,\"value\":\"beamapp-lorenzo-0128125622-520176\"},{\"key\":\"staging_location\"\
,\"namespace\":\"apache_beam.options.pipeline_options.PipelineOptions\",\"\
type\":\"STRING\",\"value\":\"gs://MYBUCKET/dataflow/staging/beamapp-lorenzo-0128125622-520176.1580216182.520259\"\
},{\"key\":\"temp_location\",\"namespace\":\"apache_beam.options.pipeline_options.PipelineOptions\"\
,\"type\":\"STRING\",\"value\":\"gs://TMPBUCKET/temp/beamapp-lorenzo-0128125622-520176.1580216182.520259\"\
},{\"key\":\"region\",\"namespace\":\"apache_beam.options.pipeline_options.PipelineOptions\"\
,\"type\":\"STRING\",\"value\":\"us-east1\"},{\"key\":\"template_location\"\
,\"namespace\":\"apache_beam.options.pipeline_options.PipelineOptions\",\"\
type\":\"STRING\",\"value\":\"gs://MYBUCKET/dataflow/templates/JOBNAME\"\
},{\"key\":\"max_num_workers\",\"namespace\":\"apache_beam.options.pipeline_options.PipelineOptions\"\
,\"type\":\"INTEGER\",\"value\":36},{\"key\":\"machine_type\",\"namespace\"\
:\"apache_beam.options.pipeline_options.PipelineOptions\",\"type\":\"STRING\"\
,\"value\":\"n1-custom-2-13312\"},{\"key\":\"use_public_ips\",\"namespace\"\
:\"apache_beam.options.pipeline_options.PipelineOptions\",\"type\":\"BOOLEAN\"\
,\"value\":false},{\"key\":\"experiments\",\"namespace\":\"apache_beam.options.pipeline_options.PipelineOptions\"\
,\"type\":\"STRING\",\"value\":\"['use_fastavro']\"},{\"key\":\"beam_plugins\"\
,\"namespace\":\"apache_beam.options.pipeline_options.PipelineOptions\",\"\
type\":\"STRING\",\"value\":\"['apache_beam.io.filesystem.FileSystem', 'apache_beam.io.hadoopfilesystem.HadoopFileSystem',\
\ 'apache_beam.io.localfilesystem.LocalFileSystem', 'apache_beam.io.gcp.gcsfilesystem.GCSFileSystem']\"\
},{\"key\":\"save_main_session\",\"namespace\":\"apache_beam.options.pipeline_options.PipelineOptions\"\
,\"type\":\"BOOLEAN\",\"value\":true},{\"key\":\"input\",\"namespace\":\"\
apache_beam.options.pipeline_options.PipelineOptions\",\"type\":\"STRING\"\
,\"value\":\"RuntimeValueProvider(option: input, type: str, default_value:\
\ None)\"},{\"key\":\"output\",\"namespace\":\"apache_beam.options.pipeline_options.PipelineOptions\"\
,\"type\":\"STRING\",\"value\":\"RuntimeValueProvider(option: output, type:\
\ str, default_value: None)\"},{\"key\":\"data4\",\"namespace\":\"apache_beam.options.pipeline_options.PipelineOptions\"\
,\"type\":\"STRING\",\"value\":\"RuntimeValueProvider(option: data4, type:\
\ str, default_value: None)\"},{\"key\":\"data3\",\"namespace\":\"\
apache_beam.options.pipeline_options.PipelineOptions\",\"type\":\"STRING\"\
,\"value\":\"RuntimeValueProvider(option: data3, type: str, default_value:\
\ None)\"},{\"key\":\"data2\",\"namespace\":\"apache_beam.options.pipeline_options.PipelineOptions\"\
,\"type\":\"STRING\",\"value\":\"RuntimeValueProvider(option: data2, type:\
\ str, default_value: None)\"},{\"key\":\"data1\",\"namespace\":\"\
apache_beam.options.pipeline_options.PipelineOptions\",\"type\":\"STRING\"\
,\"value\":\"RuntimeValueProvider(option: data1, type: str, default_value:\
\ None)\"},{\"key\":\"templateLocation\",\"namespace\":\"google.dataflow.v1beta3.TemplatesService\"\
,\"type\":\"STRING\",\"value\":\"gs://MYBUCKET/dataflow/templates/JOBNAME\"\
}],\"options\":{\"data3\":\"gs://MYBUCKET/FILENAME.json.b2\"\
,\"data2\":\"gs://MYBUCKET/FILENAME.json.b2\",\"autoscalingAlgorithm\"\
:\"NONE\",\"beam_plugins\":[\"apache_beam.io.filesystem.FileSystem\",\"apache_beam.io.hadoopfilesystem.HadoopFileSystem\"\
,\"apache_beam.io.localfilesystem.LocalFileSystem\",\"apache_beam.io.gcp.gcsfilesystem.GCSFileSystem\"\
],\"dataflowJobId\":\"2020-01-28_05_00_18-6144993012916173105\",\"dataflow_endpoint\"\
:\"https://dataflow.googleapis.com\",\"direct_num_workers\":1,\"direct_runner_bundle_repeat\"\
:0,\"direct_runner_use_stacked_bundle\":true,\"dry_run\":false,\"enable_streaming_engine\"\
:false,\"environment_cache_millis\":0,\"experiments\":[\"use_fastavro\"],\"\
gcpTempLocation\":\"gs://dataflow-staging-us-east1-657091687220/temp\",\"\
input\":\"gs://MYBUCKET/input.json\",\"job_name\":\"beamapp-lorenzo-0128125622-520176\"\
,\"machine_type\":\"n1-custom-2-13312\",\"data1\":\"gs://MYBUCKET/FILENAME.json.b2\"\
,\"maxNumWorkers\":36,\"max_num_workers\":36,\"no_auth\":false,\"numWorkers\"\
:3,\"output\":\"gs://MYBUCKET/outputs/qout.json\",\"pipelineUrl\":\"\
gs://MYBUCKET/dataflow/staging/beamapp-lorenzo-0128125622-520176.1580216182.520259/pipeline.pb\"\
,\"pipeline_type_check\":true,\"profile_cpu\":false,\"profile_memory\":false,\"\
profile_sample_rate\":1,\"project\":\"PROJECT_NAME\",\"data4\":\"gs://MYBUCKET/FILENAME.json.b2\"\
,\"region\":\"us-east1\",\"runner\":\"DataflowRunner\",\"runtime_type_check\"\
:false,\"save_main_session\":true,\"sdk_location\":\"default\",\"sdk_worker_parallelism\"\
:0,\"staging_location\":\"gs://MYBUCKET/dataflow/staging/beamapp-lorenzo-0128125622-520176.1580216182.520259\"\
,\"streaming\":false,\"temp_location\":\"gs://TMPBUCKET/temp/beamapp-lorenzo-0128125622-520176.1580216182.520259\"\
,\"templateLocation\":\"gs://MYBUCKET/dataflow/templates/JOBNAME\",\"template_location\"\
:\"gs://MYBUCKET/dataflow/templates/JOBNAME\",\"type_check_strictness\"\
:\"DEFAULT_TO_ANY\",\"update\":false,\"use_public_ips\":false}}"
shutdown-script: |-
#!/bin/bash
sudo /var/lib/agent/shutdown --dataflow_base_path=https://dataflow.googleapis.com/ --region=us-east1
user-data: |
#cloud-config
bootcmd:
- mount --bind /mnt/stateful_partition/var/lib/agent /var/lib/agent
- mount -o remount,rw,exec /var/lib/agent
- iptables -w -A INPUT -p tcp --dport 4194 -j ACCEPT
- iptables -w -A INPUT -p tcp --dport 5555 -j ACCEPT
- iptables -w -A INPUT -p tcp --dport 12345 -j ACCEPT
- iptables -w -A INPUT -p tcp --dport 12346 -j ACCEPT
- iptables -w -A INPUT -p tcp --dport 12347 -j ACCEPT
- mkdir -p /etc/systemd/network/99-virtio.network.d
- echo -e "[Network]\nDHCP=yes\nIPv6AcceptRA=yes" > /etc/systemd/network/99-virtio.network.d/ipv6.conf
- systemctl restart systemd-networkd
- sysctl -w net.ipv4.ipfrag_low_thresh=196608
- sysctl -w net.ipv4.ipfrag_high_thresh=262144
runcmd:
- sudo /bin/bash /var/lib/nvidia/setup_gpu.sh
- systemctl start agent.service
- systemctl start kubelet.service
- systemctl start resource.service
- sed "s/^/[PARTITION INFO]\t/" /proc/partitions
- df -h | sed "s/^/[FILESYSTEM INFO]\t/"
write_files:
-
content: |
[Unit]
Description=Start kubelet
Wants=network-online.target
After=docker.socket network-online.target
[Service]
ExecStartPre=/var/lib/agent/boot_checker --endpoint=https://dataflow.googleapis.com/ --region=us-east1
ExecStartPre=/bin/mkdir -p /etc/kubernetes/manifests
ExecStart=/usr/bin/kubelet --host-network-sources=* --manifest-url=http://metadata.google.internal/computeMetadata/v1/instance/attributes/google-container-manifest --manifest-url-header=Metadata-Flavor:Google --pod-manifest-path=/etc/kubernetes/manifests --allow-privileged=true --eviction-hard="" --image-gc-high-threshold=100
Restart=always
RestartSec=20
owner: root
path: /etc/systemd/system/kubelet.service
permission: 0644
-
content: |
[Unit]
Description=Start Dataflow host agent
Wants=network-online.target
After=network-online.target
[Service]
ExecStart=/var/lib/agent/agent --endpoint=https://dataflow.googleapis.com/ --region=us-east1
Restart=always
RestartSec=20
owner: root
path: /etc/systemd/system/agent.service
permission: 0644
-
content: |
[Unit]
Description=Start Dataflow resource capture agent
Wants=network-online.target
After=network-online.target
[Service]
ExecStart=/var/lib/agent/resource_capture --endpoint=https://dataflow.googleapis.com/ --region=us-east1
Restart=always
RestartSec=20
owner: root
path: /etc/systemd/system/resource.service
permission: 0644
network: default
numWorkers: 2
onHostMaintenance: MIGRATE
packages:
- location: storage.googleapis.com/MYBUCKET/dataflow/staging/beamapp-lorenzo-0128125622-520176.1580216182.520259/pickled_main_session
name: pickled_main_session
- location: storage.googleapis.com/MYBUCKET/dataflow/staging/beamapp-lorenzo-0128125622-520176.1580216182.520259/dataflow_python_sdk.tar
name: dataflow_python_sdk.tar
- location: storage.googleapis.com/MYBUCKET/dataflow/staging/beamapp-lorenzo-0128125622-520176.1580216182.520259/apache_beam-2.16.0-cp36-cp36m-manylinux1_x86_64.whl
name: apache_beam-2.16.0-cp36-cp36m-manylinux1_x86_64.whl
poolArgs: {}
teardownPolicy: TEARDOWN_ALWAYS
workerHarnessContainerImage: gcr.io/cloud-dataflow/v1beta3/python36:2.16.0
zone: us-east1-d
id: 2020-01-28_05_00_18-6144993012916173105
location: us-east1
name: JOB_NAME
state: Cancelled
stateTime: '2020-01-28 13:03:11'
type: Batch
Solution
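The post does not record an accepted fix, but the environment dump points at one plausible cause: `temp_location` is baked into the template at creation time, while `gcpTempLocation`/`tempStoragePrefix` are chosen by the service at launch, defaulting to the auto-created `dataflow-staging-*` bucket. `gcloud dataflow jobs run` accepts a `--staging-location` flag that overrides this default, so passing it at launch should keep the temporary files in your own bucket. An untested sketch, reusing the placeholders from the question:

```shell
# Re-run the template, explicitly pointing the service at our own bucket
# so gcpTempLocation no longer falls back to dataflow-staging-us-east1-*.
# BUCKETNAME, DFLOWJOBNAME and the parameter list are placeholders.
gcloud dataflow jobs run DFLOWJOBNAME \
  --gcs-location gs://BUCKETNAME/dataflow/templates/qtest \
  --staging-location gs://BUCKETNAME/temp \
  --region us-east1 \
  --parameters INPUT_AND_OUTPUTFILESPARAMS
```

If the Avro reads at the end of the job go through a `RuntimeValueProvider`, also check that the launching service account can read the bucket you point this flag at.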