# PySpark
Here is our current config:
* 1 Driver, 64 GB Memory, 16 Cores
* 8 Workers, each with 64 GB Memory, 16 Cores
* Each worker runs up to 4 tasks concurrently, with each task allocated 4 cores (16 cores / 4 cores per task = 4 tasks per worker, or 32 across the 8 workers); see the config sketch after this list.
* **Worker**: A node in the cluster that performs computations on a distributed dataset.
* **Executor**: A JVM process launched on a worker node that actually runs tasks and caches data. A worker can host one or more executors, which is why the two terms are often conflated.
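
To make the mapping concrete, here is a minimal PySpark sketch of how this cluster shape could be expressed with Spark's standard configuration keys. The app name is a placeholder I've added, and in practice the master URL and most of these values would be supplied through `spark-submit` rather than in code:

```python
from pyspark.sql import SparkSession

# Minimal sketch of the cluster shape above, expressed as standard Spark
# properties. The app name is a placeholder; the master URL is normally
# supplied via spark-submit, and spark.driver.memory in particular must be
# set before the driver JVM starts, so it is shown here for completeness.
spark = (
    SparkSession.builder
    .appName("cluster-config-sketch")         # hypothetical name, not from the note
    .config("spark.executor.instances", "8")  # one executor per worker node
    .config("spark.executor.cores", "16")     # all 16 cores given to that executor
    .config("spark.executor.memory", "64g")   # 64 GB per executor (ignoring memory overhead)
    .config("spark.task.cpus", "4")           # each task claims 4 cores
    .config("spark.driver.memory", "64g")     # 64 GB on the driver
    .getOrCreate()
)

# 8 executors * (16 cores / 4 cores per task) = 32 tasks running at once.
print(spark.sparkContext.defaultParallelism)
```

`spark.task.cpus` is what makes each task claim 4 cores, so an executor with `spark.executor.cores=16` schedules at most 4 tasks at a time.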

* [PySpark: Workers, Executors, Threads.](https://chat.openai.com/share/4cdd3504-68e4-4684-92db-85583ecd6023)
* [Fixing PySpark "Remote RPC" Error](https://chat.openai.com/share/1085f112-6c31-4649-8318-37d1386421f4)

---
Date: 20230724
Links to:
Tags:
References: