diff --git a/README.md b/README.md
index 4b43947..afe9ad0 100644
--- a/README.md
+++ b/README.md
@@ -2,15 +2,27 @@
 
 Latent blending allows you to generate smooth video transitions between two prompts. It is based on [stable diffusion 2.1](https://stability.ai/blog/stablediffusion2-1-release7-dec-2022) and remixes the latent representations using spherical linear interpolation. This results in seamless transitions, where one image slowly turns into another. 
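+
+The core operation is spherical linear interpolation (slerp) between two latent tensors. A minimal sketch of slerp, for illustration only and not the repository's exact implementation:
+
+```python
+import torch
+
+def slerp(latent0: torch.Tensor, latent1: torch.Tensor, t: float, eps: float = 1e-7) -> torch.Tensor:
+    """Spherically interpolate between two latents; t=0 returns latent0, t=1 returns latent1."""
+    # angle between the two latents, treated as flat vectors
+    v0 = latent0 / (latent0.norm() + eps)
+    v1 = latent1 / (latent1.norm() + eps)
+    dot = (v0 * v1).sum().clamp(-1.0, 1.0)
+    theta = torch.acos(dot)
+    if theta.abs() < eps:
+        # nearly parallel latents: plain linear interpolation is numerically safer
+        return (1.0 - t) * latent0 + t * latent1
+    return (torch.sin((1.0 - t) * theta) * latent0 + torch.sin(t * theta) * latent1) / torch.sin(theta)
+```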
 
-# Example 1: Simple transition
+# Quickstart
+```python
+# import paths assume you run from the repository root; adjust to your layout
+from stable_diffusion_holder import StableDiffusionHolder
+from latent_blending import LatentBlending
+
+fp_ckpt = 'path_to_SD2.ckpt'       # stable diffusion 2.1 checkpoint
+fp_config = 'path_to_config.yaml'  # matching model config
+device = 'cuda'
+
+sdh = StableDiffusionHolder(fp_ckpt, fp_config, device)
+lb = LatentBlending(sdh)
+
+# quality preset and depth_strength control how deep into the diffusion the branching starts
+lb.load_branching_profile(quality='medium', depth_strength=0.4)
+lb.set_prompt1('photo of my first prompt')
+lb.set_prompt2('photo of my second prompt')
+imgs_transition = lb.run_transition()
+```
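+
+`run_transition` returns the transition frames. Assuming they come back as a list of images (PIL images or numpy arrays; check the example scripts for the exact type), they can be written to a video, e.g. with imageio:
+
+```python
+import numpy as np
+import imageio  # mp4 export additionally needs the imageio-ffmpeg package
+
+# write the frames produced by lb.run_transition() above to a movie file
+with imageio.get_writer('transition.mp4', fps=30) as writer:
+    for img in imgs_transition:
+        writer.append_data(np.asarray(img))
+```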
+
+## Example 1: Simple transition
 ![](example1.jpg)
 To run a simple transition between two prompts, run `example1_standard.py`
 
-# Example 2: Inpainting transition
+## Example 2: Inpainting transition
 ![](example2.jpg)
 To run a transition between two prompts where you want some part of the image to remain static, run `example2_inpaint.py`
 
-# Example 3: concatenated transition
+## Example 3: Concatenated transition
 To run multiple transitions between K prompts, resulting in one stitched video, run `example3_multitrans.py`
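+
+Conceptually, this just chains pairwise transitions. A rough sketch using only the quickstart API above (`example3_multitrans.py` is the reference implementation and may additionally handle seeds and continuity between segments):
+
+```python
+# lb is the LatentBlending instance from the quickstart
+list_prompts = ['photo of a beach', 'photo of a forest', 'photo of a city']
+
+imgs_all = []
+for prompt1, prompt2 in zip(list_prompts[:-1], list_prompts[1:]):
+    lb.set_prompt1(prompt1)
+    lb.set_prompt2(prompt2)
+    imgs_all.extend(lb.run_transition())
+```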
 
 # Relevant parameters