apache-spark - Difference between Spark local and standalone modes
Question
If I run Spark with local[*], and later run it in standalone mode with 2 workers (both workers on the same single machine), is there a difference?
Answer
In standalone mode you define "containers" for the Spark master and workers to run in on your machine: you can start n worker processes, and your tasks are distributed across the separate JVMs of those workers. In local mode, by contrast, everything (driver and executors) runs inside a single JVM on your local machine, using n threads for parallelism.
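The distinction above can be sketched with the standard Spark CLI. This is a minimal illustration, not a full deployment guide; the hostname `localhost`, port `7077`, and the application jar path `my-app.jar` are placeholder assumptions for a single-machine setup.

```shell
# Local mode: driver and executors share one JVM,
# using as many threads as there are CPU cores.
spark-submit --master "local[*]" my-app.jar

# Standalone mode: first start a master and two workers
# (separate JVM processes) on the same machine...
$SPARK_HOME/sbin/start-master.sh                       # master at spark://localhost:7077 (assumed)
$SPARK_HOME/sbin/start-worker.sh spark://localhost:7077  # worker JVM #1
$SPARK_HOME/sbin/start-worker.sh spark://localhost:7077  # worker JVM #2

# ...then submit against the cluster; tasks are now
# distributed across the two worker JVMs.
spark-submit --master spark://localhost:7077 my-app.jar
```

Even on one computer, the standalone setup pays inter-JVM serialization and process overhead that local mode avoids, which is why local[*] is usually preferred for development and testing on a single machine.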