Use Case

Any Python code, in parallel

You have a Python script. It works. You just need to run it on 100x more data.

Dask makes it easy to convert any Python code to run in parallel. Coiled makes it easy to scale that out to many machines in the cloud.

Sometimes, you don’t need a fancy distributed dataframe or database. Dask is good for simply scaling out a for loop. It’s also highly flexible, so you can build complex, custom task graphs that would be hard to express in other systems.
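As a sketch of the for-loop case (assuming Dask is installed; `process` here is a stand-in for your own function), you can wrap each iteration in `dask.delayed` and compute the whole batch in parallel:

```python
import dask

def process(x):
    # Stand-in for your real per-item work
    return x ** 2

# Each call is wrapped lazily; nothing runs until .compute()
results = [dask.delayed(process)(x) for x in range(10)]

# Compute all tasks in parallel on the local machine
total = dask.delayed(sum)(results).compute()
print(total)  # 285
```

The same code scales to a cloud cluster by pointing Dask at a Coiled cluster instead of the default local scheduler.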

Dask + Coiled

Docs


Submit arbitrary functions for computation in a parallelized, eager, and non-blocking way.
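A minimal sketch of this futures interface, using a local in-process client rather than a cloud cluster (assumes the `distributed` package is installed):

```python
from dask.distributed import Client

def inc(x):
    return x + 1

# A local, in-process client; with Coiled this would instead
# connect to a cluster of machines in the cloud
client = Client(processes=False)

# submit() returns immediately with a Future; the work starts eagerly
future = client.submit(inc, 10)

# Blocks only when you actually ask for the answer
result = future.result()
client.close()
```

Because `submit` is non-blocking, you can fire off many tasks at once and gather results as they finish.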


Dask Bag implements operations like map, filter, groupby and aggregations on collections of Python objects.
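For example, a Dask Bag can apply map, filter, and an aggregation to a collection of plain Python objects, evaluated lazily until `.compute()` (a small sketch; the numbers are illustrative):

```python
import dask.bag as db

# A bag of plain Python objects, split into partitions for parallelism
bag = db.from_sequence(range(10), npartitions=2)

# Chain map/filter/aggregate; nothing runs until .compute()
result = (bag.map(lambda x: x * 2)
             .filter(lambda x: x > 5)
             .sum()
             .compute())
print(result)  # 84
```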


What if you don’t have an array or dataframe? Instead of having blocks where the function is applied to each block, you can decorate functions with @delayed and have the functions themselves be lazy.
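A brief sketch of the `@delayed` decorator (the `load` and `combine` functions are hypothetical placeholders for your own code):

```python
from dask import delayed

@delayed
def load(name):
    # Placeholder for real I/O or computation
    return name.upper()

@delayed
def combine(parts):
    return "-".join(parts)

# Nothing runs yet; these calls only build a task graph
parts = [load(name) for name in ["a", "b", "c"]]
graph = combine(parts)

# Execute the whole graph in parallel
result = graph.compute()
print(result)  # "A-B-C"
```

Dask traverses the list of delayed values passed to `combine`, so dependencies between tasks are tracked automatically.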


Sign up with GitHub, Google, or email.

Use your AWS or GCP account.

Start scaling.

$ pip install coiled
$ coiled setup
$ ipython
>>> import coiled
>>> cluster = coiled.Cluster(n_workers=500)