1 change: 1 addition & 0 deletions content/copilot/concepts/index.md
@@ -11,6 +11,7 @@ children:
- /completions
- /chat
- /agents
- /spark
- /prompting
- /context
- /auto-model-selection
114 changes: 114 additions & 0 deletions content/copilot/concepts/spark.md
@@ -0,0 +1,114 @@
---
title: About GitHub Spark
shortTitle: Spark
intro: 'Learn about building and deploying intelligent apps with natural language using {% data variables.product.prodname_spark %}.'
versions:
feature: spark
topics:
- Copilot
contentType: concepts
---

## Overview

{% data reusables.copilot.spark-overview %}

## Benefits of using {% data variables.product.prodname_spark_short %}

{% data variables.product.prodname_spark_short %} provides benefits at every stage of app development.

### Build apps with natural language or code

You don't need to know how to code to build an app with {% data variables.product.prodname_spark_short %}. You can describe what you want your app to do in natural language, and {% data variables.product.prodname_spark_short %} will generate all the necessary code for you, along with a live, interactive preview of the app.

If you do want to explore and edit the code, you can simply open the code panel in {% data variables.product.prodname_spark_short %}, or go further and open your app in a {% data variables.product.github %} codespace (a cloud-based development environment).

See [AUTOTITLE](/codespaces/about-codespaces/what-are-codespaces).

### Leverage AI capabilities

{% data variables.product.prodname_spark_short %} is natively integrated with {% data variables.product.prodname_github_models %}, so you can add AI features to your app (for example, summarizing text or suggesting image tags) simply by prompting {% data variables.product.prodname_spark_short %}. {% data variables.product.prodname_spark_short %} will add the required inference components automatically, and you can edit the system prompts that control those capabilities yourself.
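As an illustration of the kind of AI feature described above, the sketch below shows a text-summarization helper. The `spark.llm` entry point and its signature are assumptions for illustration only, not a documented API; the prompt-building function stands in for the editable system prompt.

```typescript
// Hypothetical sketch of an AI feature in a spark: summarizing user text.
// NOTE: `spark.llm` and its call shape are assumptions for illustration,
// not a documented Spark SDK API.

// Pure helper standing in for the editable system prompt.
function buildSummaryPrompt(text: string, maxWords: number): string {
  return `Summarize the following text in at most ${maxWords} words:\n\n${text}`;
}

// Assumed shape of the inference component Spark would add for you.
async function summarize(text: string): Promise<string> {
  const prompt = buildSummaryPrompt(text, 50);
  return await (globalThis as any).spark.llm(prompt);
}
```

In practice you would refine the prompt text through Spark's UI rather than in code; the helper here just makes the shape of the system prompt visible.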

### Managed data store

If {% data variables.product.prodname_spark_short %} detects the need to store data in your app, it will automatically set up a managed key-value store, so you don't need to worry about setting up and managing a database. The data store runs on Azure (Cosmos DB) and it's intended for small records (up to 512 KB per entry).
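Because each entry is limited to roughly 512 KB of key plus payload, it can be worth guarding writes in your own code. The sketch below is an illustration under stated assumptions: it approximates an entry's stored size as the UTF-8 bytes of the key plus the JSON-serialized value, which may differ from the exact accounting the store uses.

```typescript
// Sketch: guard a write against the ~512 KB per-entry limit of the
// managed key-value store. The size accounting here is an assumption.

const MAX_ENTRY_BYTES = 512 * 1024;

// Approximate the stored size as UTF-8 bytes of key + JSON payload.
function entrySize(key: string, value: unknown): number {
  const utf8 = (s: string) => new TextEncoder().encode(s).length;
  return utf8(key) + utf8(JSON.stringify(value));
}

function fitsInStore(key: string, value: unknown): boolean {
  return entrySize(key, value) <= MAX_ENTRY_BYTES;
}
```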

### Built-in security protections

{% data variables.product.prodname_spark_short %} has built-in authentication: users must sign in with their {% data variables.product.github %} account to access your app. You control who has access to your app by setting visibility and data access options.

### One-click deployment

{% data variables.product.prodname_spark_short %} comes with a fully integrated runtime environment that allows you to deploy your app in one click. All necessary infrastructure is provisioned automatically, so you don't have to worry about setting up servers or managing deployments.

All sparks are hosted and deployed on Azure Container Apps (ACA).

### Fully integrated with {% data variables.product.github %}

{% data variables.product.prodname_spark_short %} is fully integrated with {% data variables.product.github %}, so you can use familiar tools and workflows to build and manage your app.

#### Work in {% data variables.product.prodname_github_codespaces %}

* You can open a {% data variables.product.github %} codespace (a cloud-based development environment) directly from {% data variables.product.prodname_spark_short %}, so you can continue building your app there, with access to {% data variables.product.prodname_copilot_short %} and all your usual development tools.

* There's automatic syncing between the codespace and {% data variables.product.prodname_spark_short %}, so you can seamlessly switch between the two environments.

#### Create a repository with two-way syncing

* You can create a repository for your spark in one click, allowing you to manage your app's code and collaborate with others using standard {% data variables.product.github %} workflows.

* There's a two-way sync between your spark and the repository, so changes made in either {% data variables.product.prodname_spark_short %} or the main branch of your repository are automatically reflected in both places. Any changes made to your spark before repository creation are added to the repository, so you have a full record of all changes and commits made since your spark was created.

#### Invite collaborators

* If you want to invite others to contribute to building your spark, you can add them as collaborators to your repository.

#### Leverage standard {% data variables.product.github %} features

* Once you've created a repository for your spark, you can use all the standard {% data variables.product.github %} features such as pull requests, issues, and project boards to manage your spark development process, as well as leverage {% data variables.product.prodname_actions %} for CI/CD workflows.

## Develop your spark with {% data variables.product.prodname_copilot_short %}

You can combine the functionality of {% data variables.product.prodname_spark %} with {% data variables.product.prodname_copilot %} to support your app development.

### {% data variables.product.prodname_copilot_short %} agent mode

When you open your spark in a {% data variables.product.github %} codespace, you have access to all of {% data variables.product.prodname_copilot_short %}'s capabilities, including {% data variables.copilot.copilot_chat_short %} and {% data variables.product.prodname_copilot_short %} agent mode.

Agent mode is useful when you have a specific task in mind and want to enable {% data variables.product.prodname_copilot_short %} to autonomously edit your code. In agent mode, {% data variables.product.prodname_copilot_short %} determines which files to change, offers code changes and terminal commands to complete the task, and iterates to remediate issues until the original task is complete. You can also use agent mode to debug and troubleshoot issues in your code.

See [{% data variables.product.prodname_copilot_short %} agent mode](/copilot/how-tos/chat-with-copilot/chat-in-ide#agent-mode).

### {% data variables.copilot.copilot_coding_agent %}

Once your spark is connected to a {% data variables.product.github %} repository, you can use {% data variables.copilot.copilot_coding_agent %} to continue building and maintaining your app while you focus on other things.

With the coding agent, you delegate specific tasks to {% data variables.product.prodname_copilot_short %} (either by assigning an issue to {% data variables.product.prodname_copilot_short %}, or prompting {% data variables.product.prodname_copilot_short %} to create a pull request), and {% data variables.product.prodname_copilot_short %} will autonomously work in the background to complete the task. {% data variables.copilot.copilot_coding_agent %} can fix bugs, refactor code, improve test coverage, and more.

See [AUTOTITLE](/copilot/concepts/agents/coding-agent/about-coding-agent).

## Sharing your spark

When you're ready to publish your spark, you can choose from the following visibility options:

* Private to you only
* Visible to members of a specific organization on {% data variables.product.github %}
* Visible to all {% data variables.product.github %} users

You can then share your spark with others, so they can view and interact with your app. Your spark's link isn't publicly discoverable; only people who have the link can access it.

Optionally, you can publish your spark as "read-only", meaning you can showcase your app to others without them being able to edit or delete app contents.

## Limitations of {% data variables.product.prodname_spark_short %}

{% data variables.product.prodname_spark_short %} uses an opinionated stack (React, TypeScript) for reliability. For best results, you should work within {% data variables.product.prodname_spark_short %}'s SDK and core framework.

You can add external libraries, but compatibility with {% data variables.product.prodname_spark_short %}'s SDK isn’t guaranteed. You should always test your spark thoroughly after adding any external libraries.

By default, your spark's data store is shared by all users of the published spark. Make sure to delete any private or sensitive data from your app before making it visible to other users. Alternatively, publish your spark as read-only so that other users can view, but not edit or delete, your app's contents.

## Further reading

* [AUTOTITLE](/copilot/responsible-use/spark)
* [AUTOTITLE](/copilot/tutorials/spark/build-apps-with-spark)
* [AUTOTITLE](/copilot/how-tos/troubleshoot-copilot/troubleshoot-spark)
2 changes: 1 addition & 1 deletion content/copilot/get-started/features.md
@@ -73,7 +73,7 @@ Create and manage collections of documentation to use as context for chatting wi

### {% data variables.product.prodname_spark %} ({% data variables.release-phases.public_preview %})

Build and deploy full-stack applications using natural-language prompts that seamlessly integrate with the {% data variables.product.github %} platform for advanced development. See [AUTOTITLE](/copilot/tutorials/building-ai-app-prototypes).
Build and deploy full-stack applications using natural-language prompts that seamlessly integrate with the {% data variables.product.github %} platform for advanced development. See [AUTOTITLE](/copilot/tutorials/spark/build-apps-with-spark).

## {% data variables.product.prodname_copilot %} features for administrators

1 change: 1 addition & 0 deletions content/copilot/how-tos/troubleshoot-copilot/index.md
@@ -10,6 +10,7 @@ children:
- /view-logs
- /troubleshoot-firewall-settings
- /troubleshoot-network-errors
- /troubleshoot-spark
redirect_from:
- /copilot/troubleshooting-github-copilot
- /copilot/how-tos/troubleshoot
26 changes: 26 additions & 0 deletions content/copilot/how-tos/troubleshoot-copilot/troubleshoot-spark.md
@@ -0,0 +1,26 @@
---
title: Troubleshooting common issues with GitHub Spark
intro: 'This guide describes common issues with {% data variables.product.prodname_spark_short %} and how to resolve them.'
versions:
feature: spark
topics:
- Copilot
shortTitle: Troubleshoot Spark
contentType: how-tos
---

## Error: "Live preview is interrupted. Try refreshing the page to reconnect."

There is a known compatibility issue between Apple's Safari browser and the way {% data variables.product.prodname_spark_short %} renders its live preview.

To resolve the issue, switch to a different browser such as Google Chrome, Microsoft Edge, or Mozilla Firefox.

## Error: HTTP 413 ("Payload Too Large")

{% data variables.product.prodname_spark_short %} uses a key-value store for app data. The combined size of the key (the label) and payload (the actual data) must be less than 512 KB. If you save data over this limit, you'll get an HTTP 413 ("Payload Too Large") error.

To resolve the error, reduce the size of data you're trying to save, or split the data into smaller records.
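One way to split the data, sketched below under the assumption that your payload is a JSON-serializable array, is to chunk it into sequentially keyed records that each stay under the limit. The `key:0`, `key:1` naming scheme is illustrative, not a Spark convention.

```typescript
// Sketch: split an array payload into multiple records that each stay
// under the 512 KB key + payload limit. The record-naming scheme is
// illustrative only.

const MAX_ENTRY_BYTES = 512 * 1024;
const utf8Bytes = (s: string) => new TextEncoder().encode(s).length;

function splitIntoRecords<T>(key: string, items: T[]): Array<[string, T[]]> {
  const records: Array<[string, T[]]> = [];
  let chunk: T[] = [];
  for (const item of items) {
    const candidate = [...chunk, item];
    const candidateKey = `${key}:${records.length}`;
    const tooBig =
      utf8Bytes(candidateKey) + utf8Bytes(JSON.stringify(candidate)) > MAX_ENTRY_BYTES;
    if (chunk.length > 0 && tooBig) {
      // Close the current record and start a new one with this item.
      records.push([candidateKey, chunk]);
      chunk = [item];
    } else {
      // Note: a single item larger than the limit still won't fit.
      chunk = candidate;
    }
  }
  if (chunk.length > 0) records.push([`${key}:${records.length}`, chunk]);
  return records;
}
```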

## App fails to build after adding an external library

{% data variables.product.prodname_spark_short %} uses an opinionated stack (React, TypeScript) for reliability. You can add external libraries to your spark, but compatibility isn’t guaranteed and you should test additions thoroughly. For best results, you should work within {% data variables.product.prodname_spark_short %}'s SDK and core framework.
2 changes: 1 addition & 1 deletion content/copilot/responsible-use/spark.md
@@ -23,7 +23,7 @@ contentType: rai

### Input processing

> [!NOTE] {% data variables.product.prodname_spark_short %} currently leverages {% data variables.copilot.copilot_claude_sonnet_40 %}. This model is subject to change.
{% data reusables.copilot.spark-model %}

Input prompts in {% data variables.product.prodname_spark_short %} are pre-processed by {% data variables.product.prodname_copilot_short %}, augmented with contextual information from your current {% data variables.product.prodname_spark_short %} inputs, and sent to a large language model-powered agent within your development environment. This context includes information from your spark such as code from your current application, previous prompts supplied in the {% data variables.product.prodname_spark_short %} interface, and any error logs from your spark’s development environment.

3 changes: 1 addition & 2 deletions content/copilot/tutorials/index.md
@@ -10,11 +10,10 @@ children:
- /copilot-chat-cookbook
- /customization-library
- /coding-agent
- /spark
- /enhance-agent-mode-with-mcp
- /compare-ai-models
- /speed-up-development-work
- /easy-apps-with-spark
- /build-apps-with-spark
- /roll-out-at-scale
- /explore-a-codebase
- /explore-issues-and-discussions
@@ -1,6 +1,6 @@
---
title: Building and deploying AI-powered apps with GitHub Spark
shortTitle: Build apps with Spark
shortTitle: Build and deploy apps
allowTitleToDifferFromFilename: true
intro: 'Learn how to build and deploy an intelligent web app with natural language using {% data variables.product.prodname_spark %}.'
versions:
@@ -10,6 +10,7 @@ topics:
- Copilot
redirect_from:
- /copilot/tutorials/building-ai-app-prototypes
- /copilot/tutorials/build-apps-with-spark
contentType: tutorials
---

@@ -19,11 +20,9 @@ contentType: tutorials

## Introduction

With {% data variables.product.prodname_spark %}, you can describe what you want in natural language and get a fullstack web app with data storage, AI features, and {% data variables.product.github %} authentication built in. You can iterate using prompts, visual tools, or code, and then deploy with a click to a fully managed runtime.
{% data reusables.copilot.spark-overview %}

{% data variables.product.prodname_spark_short %} is seamlessly integrated with {% data variables.product.github %} so you can develop your spark via a synced {% data variables.product.github %} codespace with {% data variables.product.prodname_copilot_short %} for advanced editing. You can also create a repository for team collaboration, and leverage {% data variables.product.github %}'s ecosystem of tools and integrations.

This tutorial will guide you through building and deploying an app with {% data variables.product.prodname_spark_short %} and exploring its features.
This tutorial will guide you through the full lifecycle of building and deploying an app with {% data variables.product.prodname_spark_short %} and exploring its features.

### Prerequisites

@@ -49,7 +48,7 @@ For this tutorial, we'll create a simple marketing tool app, where:
```

> [!TIP]
> * Be specific, and provide as many details as possible for the best results. You can [{% data variables.copilot.copilot_chat_short %}](https://github.com/copilot) to refine or suggest improvements to your initial prompt.
> * Be specific, and provide as many details as possible for the best results. You can ask [{% data variables.copilot.copilot_chat_short %}](https://github.com/copilot) to refine or suggest improvements to your initial prompt.
> * Alternatively, drop a markdown document into the input field to provide {% data variables.product.prodname_spark_short %} with more context on what you're hoping to build.

1. Optionally, upload an image to provide {% data variables.product.prodname_spark_short %} with a visual reference for your app. Mocks, sketches, or screenshots all work to provide {% data variables.product.prodname_spark_short %} with an idea of what you want to build.
@@ -146,7 +145,7 @@ You can view or edit your app’s code directly in {% data variables.product.pro
> * You can also choose to share your spark as **read-only** so that other users can view your app's content, but they cannot edit content, delete files or records, or create new items.

1. In the top right corner, click **Publish**.
1. By default, your spark will be private and only accessible to you. Under "Visibility", choose whether you want your spark to remain private, or make it available to all {% data variables.product.github %} users.
1. By default, your spark will be private and only accessible to you. Under "Visibility", choose whether you want your spark to remain private, or make it available to members of a specific organization on {% data variables.product.github %}, or all {% data variables.product.github %} users.

![Screenshot of the {% data variables.product.prodname_spark %} publication menu. The "All {% data variables.product.github %} users" visibility option is outlined in orange.](/assets/images/help/copilot/spark-github-user-visibility.png)

14 changes: 14 additions & 0 deletions content/copilot/tutorials/spark/index.md
@@ -0,0 +1,14 @@
---
title: Building apps with GitHub Spark
shortTitle: Spark
intro: 'Learn how to build and deploy an app using natural language with GitHub Spark.'
versions:
feature: spark
topics:
- Copilot
children:
- /your-first-spark
- /prompt-tips
- /build-apps-with-spark
contentType: tutorials
---