Commit 780b10b

Merge pull request #40505 from github/repo-sync
Repo sync
2 parents 252aa5f + 1f39c0d commit 780b10b

File tree

13 files changed: +253 −15 lines changed
content/copilot/concepts/index.md

Lines changed: 1 addition & 0 deletions
@@ -11,6 +11,7 @@ children:
 - /completions
 - /chat
 - /agents
+- /spark
 - /prompting
 - /context
 - /auto-model-selection

content/copilot/concepts/spark.md

Lines changed: 114 additions & 0 deletions
@@ -0,0 +1,114 @@
---
title: About GitHub Spark
shortTitle: Spark
intro: 'Learn about building and deploying intelligent apps with natural language using {% data variables.product.prodname_spark %}.'
versions:
  feature: spark
topics:
  - Copilot
contentType: concepts
---
## Overview

{% data reusables.copilot.spark-overview %}

## Benefits of using {% data variables.product.prodname_spark_short %}

{% data variables.product.prodname_spark_short %} can provide a wide range of benefits at all stages of app development.

### Build apps with natural language or code

You don't need to know how to code to build an app with {% data variables.product.prodname_spark_short %}. You can describe what you want your app to do in natural language, and {% data variables.product.prodname_spark_short %} will generate all the necessary code for you, along with a live, interactive preview of the app.

If you do want to explore and edit the code, you can open the code panel in {% data variables.product.prodname_spark_short %}, or go further and open your app in a {% data variables.product.github %} codespace (a cloud-based development environment).

See [AUTOTITLE](/codespaces/about-codespaces/what-are-codespaces).

### Leverage AI capabilities

{% data variables.product.prodname_spark_short %} is natively integrated with {% data variables.product.prodname_github_models %}, so you can add AI features to your app (for example, summarizing text or suggesting image tags) simply by prompting {% data variables.product.prodname_spark_short %}. {% data variables.product.prodname_spark_short %} adds the required inference components automatically, and you can edit the system prompts that control those capabilities yourself.

### Managed data store

If {% data variables.product.prodname_spark_short %} detects that your app needs to store data, it automatically sets up a managed key-value store, so you don't need to set up and manage a database. The data store runs on Azure (Cosmos DB) and is intended for small records (up to 512 KB per entry).
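Because each entry is capped at 512 KB, it can help to check an entry's serialized size before writing it. The following is a minimal sketch in TypeScript; the helper names and the counting rule (key plus JSON-serialized payload) are illustrative assumptions, not part of the {% data variables.product.prodname_spark_short %} SDK:

```typescript
// Estimate the stored size of a key-value entry before saving it,
// so writes stay under the documented 512 KB per-entry limit.
const MAX_ENTRY_BYTES = 512 * 1024;

function entrySizeBytes(key: string, value: unknown): number {
  const encoder = new TextEncoder();
  // Assumption: both the key (the label) and the JSON-serialized
  // payload count toward the limit.
  return (
    encoder.encode(key).length +
    encoder.encode(JSON.stringify(value)).length
  );
}

function fitsInEntry(key: string, value: unknown): boolean {
  return entrySizeBytes(key, value) < MAX_ENTRY_BYTES;
}
```

For example, `fitsInEntry("settings", { theme: "dark" })` passes, while a payload of several hundred kilobytes of text does not.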
### Built-in security protections

{% data variables.product.prodname_spark_short %} has built-in authentication: users must sign in with their {% data variables.product.github %} account to access your app. You control who has access to your app by setting visibility and data access options.

### One-click deployment

{% data variables.product.prodname_spark_short %} comes with a fully integrated runtime environment that lets you deploy your app in one click. All necessary infrastructure is provisioned automatically, so you don't have to set up servers or manage deployments.

All sparks are hosted and deployed on Azure Container Apps (ACA).
### Fully integrated with {% data variables.product.github %}

{% data variables.product.prodname_spark_short %} is fully integrated with {% data variables.product.github %}, so you can use familiar tools and workflows to build and manage your app.

#### Work in {% data variables.product.prodname_github_codespaces %}

* You can open a {% data variables.product.github %} codespace (a cloud-based development environment) directly from {% data variables.product.prodname_spark_short %}, so you can continue building your app there, with access to {% data variables.product.prodname_copilot_short %} and all your usual development tools.

* There's automatic syncing between the codespace and {% data variables.product.prodname_spark_short %}, so you can seamlessly switch between the two environments.

#### Create a repository with two-way syncing

* You can create a repository for your spark in one click, allowing you to manage your app's code and collaborate with others using standard {% data variables.product.github %} workflows.

* There's a two-way sync between your spark and the main branch of your repository, so changes made in either place are automatically reflected in the other. Any changes made to your spark before the repository was created are added to your repository, so you have a full record of all changes and commits made to your spark since its creation.

#### Invite collaborators

* If you want to invite others to contribute to building your spark, you can add them as collaborators to your repository.

#### Leverage standard {% data variables.product.github %} features

* Once you've created a repository for your spark, you can use all the standard {% data variables.product.github %} features, such as pull requests, issues, and project boards, to manage your spark development process, as well as leverage {% data variables.product.prodname_actions %} for CI/CD workflows.
## Develop your spark with {% data variables.product.prodname_copilot_short %}

You can combine the functionality of {% data variables.product.prodname_spark %} with {% data variables.product.prodname_copilot %} to support your app development.

### {% data variables.product.prodname_copilot_short %} agent mode

When you open your spark in a {% data variables.product.github %} codespace, you have access to all of {% data variables.product.prodname_copilot_short %}'s capabilities, including {% data variables.copilot.copilot_chat_short %} and {% data variables.product.prodname_copilot_short %} agent mode.

Agent mode is useful when you have a specific task in mind and want to enable {% data variables.product.prodname_copilot_short %} to autonomously edit your code. In agent mode, {% data variables.product.prodname_copilot_short %} determines which files to change, offers code changes and terminal commands to complete the task, and iterates to remediate issues until the original task is complete. This lets you take your app's development further, and leverage {% data variables.product.prodname_copilot_short %} to debug and troubleshoot issues in your code.

See [{% data variables.product.prodname_copilot_short %} agent mode](/copilot/how-tos/chat-with-copilot/chat-in-ide#agent-mode).

### {% data variables.copilot.copilot_coding_agent %}

Once your spark is connected to a {% data variables.product.github %} repository, you can use {% data variables.copilot.copilot_coding_agent %} to continue building and maintaining your app while you focus on other things.

With the coding agent, you delegate specific tasks to {% data variables.product.prodname_copilot_short %} (either by assigning an issue to {% data variables.product.prodname_copilot_short %}, or by prompting {% data variables.product.prodname_copilot_short %} to create a pull request), and {% data variables.product.prodname_copilot_short %} will autonomously work in the background to complete the task. {% data variables.copilot.copilot_coding_agent %} can fix bugs, refactor code, improve test coverage, and more.

See [AUTOTITLE](/copilot/concepts/agents/coding-agent/about-coding-agent).
## Sharing your spark

When you're ready to publish your spark, you can choose from the following visibility options:

* Private to you only
* Visible to members of a specific organization on {% data variables.product.github %}
* Visible to all {% data variables.product.github %} users

You can then share your spark with others, so they can view and interact with your app. The link to your spark is not discoverable; only people you share the link with can find it.

Optionally, you can publish your spark as "read-only", meaning you can showcase your app to others without them being able to edit or delete app contents.
## Limitations of {% data variables.product.prodname_spark_short %}

{% data variables.product.prodname_spark_short %} uses an opinionated stack (React, TypeScript) for reliability. For best results, work within {% data variables.product.prodname_spark_short %}'s SDK and core framework.

You can add external libraries, but compatibility with {% data variables.product.prodname_spark_short %}'s SDK isn't guaranteed. Always test your spark thoroughly after adding any external libraries.

By default, your spark's data store is shared across all users of the published spark. Make sure to delete any private or sensitive data from your app before making it visible to other users. You can optionally publish your spark as "read-only", meaning you can showcase your app to others without them being able to edit or delete app contents.
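Because the data store is shared across users by default, one common pattern is to namespace keys per user so one user's records never collide with another's. The following is an illustrative sketch only; the helper is not part of the {% data variables.product.prodname_spark_short %} SDK:

```typescript
// Prefix each key with the owning user's login so records for
// different users stay separate in a shared key-value store.
function userScopedKey(userLogin: string, key: string): string {
  // Encode the login so a login containing the ":" separator
  // can't spoof another user's namespace.
  return `user:${encodeURIComponent(userLogin)}:${key}`;
}
```

For example, `userScopedKey("octocat", "todos")` yields `"user:octocat:todos"`, keeping octocat's to-do list distinct from every other user's.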
## Further reading

* [AUTOTITLE](/copilot/responsible-use/spark)
* [AUTOTITLE](/copilot/tutorials/spark/build-apps-with-spark)
* [AUTOTITLE](/copilot/how-tos/troubleshoot-copilot/troubleshoot-spark)

content/copilot/get-started/features.md

Lines changed: 1 addition & 1 deletion
@@ -73,7 +73,7 @@ Create and manage collections of documentation to use as context for chatting wi
 ### {% data variables.product.prodname_spark %} ({% data variables.release-phases.public_preview %})

-Build and deploy full-stack applications using natural-language prompts that seamlessly integrate with the {% data variables.product.github %} platform for advanced development. See [AUTOTITLE](/copilot/tutorials/building-ai-app-prototypes).
+Build and deploy full-stack applications using natural-language prompts that seamlessly integrate with the {% data variables.product.github %} platform for advanced development. See [AUTOTITLE](/copilot/tutorials/spark/build-apps-with-spark).

 ## {% data variables.product.prodname_copilot %} features for administrators
content/copilot/how-tos/troubleshoot-copilot/index.md

Lines changed: 1 addition & 0 deletions
@@ -10,6 +10,7 @@ children:
 - /view-logs
 - /troubleshoot-firewall-settings
 - /troubleshoot-network-errors
+- /troubleshoot-spark
 redirect_from:
 - /copilot/troubleshooting-github-copilot
 - /copilot/how-tos/troubleshoot
Lines changed: 26 additions & 0 deletions
@@ -0,0 +1,26 @@
---
title: Troubleshooting common issues with GitHub Spark
intro: 'This guide describes common issues with {% data variables.product.prodname_spark_short %} and how to resolve them.'
versions:
  feature: spark
topics:
  - Copilot
shortTitle: Troubleshoot Spark
contentType: how-tos
---
## Error: "Live preview is interrupted. Try refreshing the page to reconnect."

There is a known compatibility issue between Apple's Safari browser and the way {% data variables.product.prodname_spark_short %} renders its live preview.

To resolve the issue, switch to a different browser, such as Google Chrome, Microsoft Edge, or Mozilla Firefox.
## Error: HTTP 413 ("Payload Too Large")

{% data variables.product.prodname_spark_short %} uses a key-value store for app data. The combined size of the key (the label) and payload (the actual data) must be less than 512 KB. If you save data over this limit, you'll get an HTTP 413 ("Payload Too Large") error.

To resolve the error, reduce the size of the data you're trying to save, or split the data into smaller records.
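Splitting the data into smaller records can be sketched as follows. This chunking helper is illustrative only, not part of the {% data variables.product.prodname_spark_short %} SDK; each chunk would then be saved under its own key (for example `report:0`, `report:1`, and so on):

```typescript
// Split a large string payload into chunks that each fit well under
// the 512 KB per-entry cap, leaving headroom for the key and JSON
// quoting. Note: this slices by UTF-16 code units; multi-byte content
// would need a byte-aware splitter.
const CHUNK_SIZE = 400 * 1024;

function splitIntoChunks(payload: string, chunkSize: number = CHUNK_SIZE): string[] {
  const chunks: string[] = [];
  for (let i = 0; i < payload.length; i += chunkSize) {
    chunks.push(payload.slice(i, i + chunkSize));
  }
  return chunks;
}
```

Reading the record back is then a matter of fetching the chunks in order and joining them.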
## App fails to build after adding an external library

{% data variables.product.prodname_spark_short %} uses an opinionated stack (React, TypeScript) for reliability. You can add external libraries to your spark, but compatibility isn't guaranteed, so test additions thoroughly. For best results, work within {% data variables.product.prodname_spark_short %}'s SDK and core framework.

content/copilot/responsible-use/spark.md

Lines changed: 1 addition & 1 deletion
@@ -23,7 +23,7 @@ contentType: rai
 ### Input processing

-> [!NOTE] {% data variables.product.prodname_spark_short %} currently leverages {% data variables.copilot.copilot_claude_sonnet_40 %}. This model is subject to change.
+{% data reusables.copilot.spark-model %}

 Input prompts in {% data variables.product.prodname_spark_short %} are pre-processed by {% data variables.product.prodname_copilot_short %}, augmented with contextual information from your current {% data variables.product.prodname_spark_short %} inputs and sent to a large language model powered agent within your development environment. Included context includes information from your spark such as code from your current application, previous prompts supplied in the {% data variables.product.prodname_spark_short %} interface, and any error logs from your spark's development environment.
content/copilot/tutorials/index.md

Lines changed: 1 addition & 2 deletions
@@ -10,11 +10,10 @@ children:
 - /copilot-chat-cookbook
 - /customization-library
 - /coding-agent
+- /spark
 - /enhance-agent-mode-with-mcp
 - /compare-ai-models
 - /speed-up-development-work
-- /easy-apps-with-spark
-- /build-apps-with-spark
 - /roll-out-at-scale
 - /explore-a-codebase
 - /explore-issues-and-discussions

content/copilot/tutorials/build-apps-with-spark.md renamed to content/copilot/tutorials/spark/build-apps-with-spark.md

Lines changed: 6 additions & 7 deletions
@@ -1,6 +1,6 @@
 ---
 title: Building and deploying AI-powered apps with GitHub Spark
-shortTitle: Build apps with Spark
+shortTitle: Build and deploy apps
 allowTitleToDifferFromFilename: true
 intro: 'Learn how to build and deploy an intelligent web app with natural language using {% data variables.product.prodname_spark %}.'
 versions:
@@ -10,6 +10,7 @@ topics:
 - Copilot
 redirect_from:
 - /copilot/tutorials/building-ai-app-prototypes
+- /copilot/tutorials/build-apps-with-spark
 contentType: tutorials
 ---

@@ -19,11 +20,9 @@ contentType: tutorials
 ## Introduction

-With {% data variables.product.prodname_spark %}, you can describe what you want in natural language and get a fullstack web app with data storage, AI features, and {% data variables.product.github %} authentication built in. You can iterate using prompts, visual tools, or code, and then deploy with a click to a fully managed runtime.
+{% data reusables.copilot.spark-overview %}

-{% data variables.product.prodname_spark_short %} is seamlessly integrated with {% data variables.product.github %} so you can develop your spark via a synced {% data variables.product.github %} codespace with {% data variables.product.prodname_copilot_short %} for advanced editing. You can also create a repository for team collaboration, and leverage {% data variables.product.github %}'s ecosystem of tools and integrations.
-
-This tutorial will guide you through building and deploying an app with {% data variables.product.prodname_spark_short %} and exploring its features.
+This tutorial will guide you through the full lifecycle of building and deploying an app with {% data variables.product.prodname_spark_short %} and exploring its features.

 ### Prerequisites

@@ -49,7 +48,7 @@ For this tutorial, we'll create a simple marketing tool app, where:
 ```

 > [!TIP]
-> * Be specific, and provide as many details as possible for the best results. You can [{% data variables.copilot.copilot_chat_short %}](https://github.com/copilot) to refine or suggest improvements to your initial prompt.
+> * Be specific, and provide as many details as possible for the best results. You can ask [{% data variables.copilot.copilot_chat_short %}](https://github.com/copilot) to refine or suggest improvements to your initial prompt.
 > * Alternatively, drop a markdown document into the input field to provide {% data variables.product.prodname_spark_short %} with more context on what you're hoping to build.

 1. Optionally, upload an image to provide {% data variables.product.prodname_spark_short %} with a visual reference for your app. Mocks, sketches, or screenshots all work to provide {% data variables.product.prodname_spark_short %} with an idea of what you want to build.

@@ -146,7 +145,7 @@ You can view or edit your app’s code directly in {% data variables.product.pro
 > * You can also choose to share your spark as **read-only** so that other users can view your app's content, but they cannot edit content, delete files or records, or create new items.

 1. In the top right corner, click **Publish**.
-1. By default, your spark will be private and only accessible to you. Under "Visibility", choose whether you want your spark to remain private, or make it available to all {% data variables.product.github %} users.
+1. By default, your spark will be private and only accessible to you. Under "Visibility", choose whether you want your spark to remain private, or make it available to members of a specific organization on {% data variables.product.github %}, or all {% data variables.product.github %} users.

 ![Screenshot of the {% data variables.product.prodname_spark %} publication menu. The "All {% data variables.product.github %} users" visibility option is outlined in orange.](/assets/images/help/copilot/spark-github-user-visibility.png)
Lines changed: 14 additions & 0 deletions
@@ -0,0 +1,14 @@
---
title: Building apps with GitHub Spark
shortTitle: Spark
intro: 'Learn how to build and deploy an app using natural language with GitHub Spark.'
versions:
  feature: spark
topics:
  - Copilot
children:
  - /your-first-spark
  - /prompt-tips
  - /build-apps-with-spark
contentType: tutorials
---
