Batch Processing

| Package | luigi | dask | PySpark | mrjob | Ray |
| --- | --- | --- | --- | --- | --- |
| Description | Luigi is a Python module that helps you build complex pipelines of batch jobs. It handles dependency resolution, workflow management, ... | Parallel computing with task scheduling. | [Apache Spark](https://spark.apache.org/) Python API. | Run MapReduce jobs on Hadoop or Amazon Web Services. | A system for parallel and distributed Python that unifies the machine learning ecosystem. |
| Category | Installable Package | Installable Package | Installable Package | Installable Package | Installable Package |
| # Using This | 0 | 0 | 0 | 0 | 0 |
| Python 3? | | | | | |
| Development Status | n/a | n/a | n/a | n/a | n/a |
| Last updated | Jan. 16, 2020, 2:07 p.m. | Jan. 17, 2020, 3:52 p.m. | | | |
| Version | n/a | n/a | n/a | n/a | n/a |
| Repo | GitHub | GitHub | Other | GitHub | GitHub |
| Commits | | | | | |
| Stars | 12,773 | 6,170 | n/a | n/a | n/a |
| Repo Forks | 2,075 | 983 | n/a | n/a | n/a |
| Participants | erikbern, Tarrasch, daveFNbuck, freider, gpoulin, dlstadther, ulzha, themalkolm, honnix, DavW, more... | mrocklin, jcrist, cowlicks, jakirkham, jrbourbeau, sinhrks, martindurant, TomAugspurger, cpcloud, shoyer, more... | | | |
| Documentation | N/A | N/A | N/A | N/A | N/A |
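
To make the descriptions above more concrete, here are brief usage sketches for three of the packages. First, the dependency resolution and workflow management that Luigi's description mentions: the sketch below chains two tasks, with the task names, file targets, and logic invented purely for illustration.

```python
# Minimal Luigi pipeline sketch (hypothetical tasks and file names).
import luigi


class FetchData(luigi.Task):
    def output(self):
        return luigi.LocalTarget("data.txt")

    def run(self):
        with self.output().open("w") as f:
            f.write("line one\nline two\n")


class CountLines(luigi.Task):
    def requires(self):
        return FetchData()  # Luigi resolves this dependency and runs FetchData first

    def output(self):
        return luigi.LocalTarget("count.txt")

    def run(self):
        with self.input().open("r") as src, self.output().open("w") as dst:
            dst.write(str(sum(1 for _ in src)))


if __name__ == "__main__":
    luigi.build([CountLines()], local_scheduler=True)
```

Because of the `requires()` declaration, Luigi runs `FetchData` before `CountLines`, and it skips any task whose output target already exists.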
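Dask's "parallel computing with task scheduling" is usually expressed by building a lazy task graph; a minimal `dask.delayed` sketch (the toy `load`/`process`/`combine` functions are assumptions for the example) could look like this:

```python
# Sketch of dask's lazy task-graph construction with dask.delayed.
from dask import delayed


@delayed
def load(i):
    return list(range(i))


@delayed
def process(chunk):
    return sum(chunk)


@delayed
def combine(parts):
    return sum(parts)


parts = [process(load(i)) for i in range(4)]  # builds the graph; nothing runs yet
total = combine(parts)
print(total.compute())                        # the scheduler executes the graph in parallel
```

Nothing executes until `.compute()` is called; Dask's scheduler then runs independent tasks in parallel.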
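And for Ray, a minimal sketch of its remote-task API (the `square()` task is purely illustrative):

```python
# Sketch of Ray's remote functions: tasks scheduled across workers.
import ray

ray.init()  # start (or connect to) a local Ray runtime


@ray.remote
def square(x):
    return x * x


futures = [square.remote(i) for i in range(8)]  # schedule eight tasks in parallel
print(ray.get(futures))                         # [0, 1, 4, 9, 16, 25, 36, 49]
```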