---
title: About GitHub Spark
shortTitle: Spark
intro: 'Learn about building and deploying intelligent apps with natural language using {% data variables.product.prodname_spark %}.'
versions:
  feature: spark
topics:
  - Copilot
contentType: concepts
---

## Overview

{% data reusables.copilot.spark-overview %}

## Benefits of using {% data variables.product.prodname_spark_short %}

{% data variables.product.prodname_spark_short %} can provide a wide range of benefits at all stages of app development.

### Build apps with natural language or code

You don't need to know how to code to build an app with {% data variables.product.prodname_spark_short %}. You can describe what you want your app to do in natural language, and {% data variables.product.prodname_spark_short %} will generate all the necessary code for you, along with a live, interactive preview of the app.

If you do want to explore and edit the code, you can simply open the code panel in {% data variables.product.prodname_spark_short %}, or go further and open your app in a {% data variables.product.github %} codespace (a cloud-based development environment).

See [AUTOTITLE](/codespaces/about-codespaces/what-are-codespaces).

### Leverage AI capabilities

{% data variables.product.prodname_spark_short %} is natively integrated with {% data variables.product.prodname_github_models %}, so you can add AI features to your app (for example, summarizing text or suggesting image tags) simply by prompting {% data variables.product.prodname_spark_short %}. {% data variables.product.prodname_spark_short %} will add the required inference components automatically, and you can edit the system prompts that control those capabilities yourself.
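
For a rough sense of what a prompt-backed feature looks like in a generated app, here is a minimal sketch. The `spark.llmPrompt` and `spark.llm` helper names and signatures are assumptions made for illustration, not a documented API reference, so check the inference code {% data variables.product.prodname_spark_short %} generates for your own app.

```typescript
// Illustrative sketch only: `spark.llmPrompt` and `spark.llm` are assumed helper
// names for the inference components Spark generates, not a documented API.
declare const spark: {
  llmPrompt: (strings: TemplateStringsArray, ...values: unknown[]) => string;
  llm: (prompt: string) => Promise<string>;
};

// Summarize free-form text, such as the body of a note in your app.
async function summarizeNote(noteText: string): Promise<string> {
  // The instructions live in the prompt, so you can adjust the feature's
  // behavior by editing the prompt rather than rewriting the app code.
  const prompt = spark.llmPrompt`Summarize the following note in two sentences:\n\n${noteText}`;
  return await spark.llm(prompt);
}
```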

### Managed data store

If {% data variables.product.prodname_spark_short %} detects the need to store data in your app, it will automatically set up a managed key-value store, so you don't need to worry about setting up and managing a database. The data store runs on Azure (Cosmos DB) and it's intended for small records (up to 512 KB per entry).
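
As a sketch of how app code can read and write that store, the following React component uses a `useKV` hook. The hook name and the `@github/spark/hooks` import path are assumptions about the code {% data variables.product.prodname_spark_short %} generates, so verify the exact SDK surface against your own app.

```tsx
// Sketch only: the `useKV` hook and its import path are assumptions about
// Spark's generated persistence code; verify them against your own app.
import { useKV } from "@github/spark/hooks";

type Task = { id: string; title: string; done: boolean };

export function TaskList() {
  // The array is persisted in the managed key-value store under the "tasks" key.
  // Keep each entry small (well under the 512 KB per-entry limit).
  const [tasks, setTasks] = useKV<Task[]>("tasks", []);

  const addTask = (title: string) =>
    setTasks([...(tasks ?? []), { id: crypto.randomUUID(), title, done: false }]);

  return (
    <ul>
      {(tasks ?? []).map((task) => (
        <li key={task.id}>{task.title}</li>
      ))}
      <li>
        <button onClick={() => addTask("New task")}>Add task</button>
      </li>
    </ul>
  );
}
```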

### Built-in security protections

{% data variables.product.prodname_spark_short %} has built-in authentication, since users need to sign in with their {% data variables.product.github %} account to access your app. You control who has access to your app by setting visibility and data access options.

### One-click deployment

{% data variables.product.prodname_spark_short %} comes with a fully integrated runtime environment that allows you to deploy your app in one click. All necessary infrastructure is provisioned automatically, so you don't have to worry about setting up servers or managing deployments.

All sparks are hosted and deployed on Azure Container Apps (ACA).

### Fully integrated with {% data variables.product.github %}

{% data variables.product.prodname_spark_short %} is fully integrated with {% data variables.product.github %}, so you can use familiar tools and workflows to build and manage your app.

#### Work in {% data variables.product.prodname_github_codespaces %}

* You can open a {% data variables.product.github %} codespace (a cloud-based development environment) directly from {% data variables.product.prodname_spark_short %}, so you can continue building your app there, with access to {% data variables.product.prodname_copilot_short %} and all your usual development tools.

* There's automatic syncing between the codespace and {% data variables.product.prodname_spark_short %}, so you can seamlessly switch between the two environments.

#### Create a repository with two-way syncing

* You can create a repository for your spark in one click, allowing you to manage your app's code and collaborate with others using standard {% data variables.product.github %} workflows.

* There's a two-way sync between your spark and the repository, so changes made in either {% data variables.product.prodname_spark_short %} or the main branch of your repository are automatically reflected in both places. Any changes made to your spark prior to repository creation will be added to your repository so you have a full record of all changes and commits made to your spark since its creation.

#### Invite collaborators

* If you want to invite others to contribute to building your spark, you can add them as collaborators to your repository.

#### Leverage standard {% data variables.product.github %} features

* Once you've created a repository for your spark, you can use all the standard {% data variables.product.github %} features such as pull requests, issues, and project boards to manage your spark development process, as well as leverage {% data variables.product.prodname_actions %} for CI/CD workflows.

## Develop your spark with {% data variables.product.prodname_copilot_short %}

You can combine the functionality of {% data variables.product.prodname_spark %} with {% data variables.product.prodname_copilot %} to support your app development.

### {% data variables.product.prodname_copilot_short %} agent mode

When you open your spark in a {% data variables.product.github %} codespace, you have access to all of {% data variables.product.prodname_copilot_short %}'s capabilities, including {% data variables.copilot.copilot_chat_short %} and {% data variables.product.prodname_copilot_short %} agent mode.

Agent mode is useful when you have a specific task in mind and want to enable {% data variables.product.prodname_copilot_short %} to autonomously edit your code. In agent mode, {% data variables.product.prodname_copilot_short %} determines which files to make changes to, offers code changes and terminal commands to complete the task, and iterates to remediate issues until the original task is complete. This lets you take your app's development to the next level, and you can also use {% data variables.product.prodname_copilot_short %} to debug and troubleshoot issues in your code.

See [{% data variables.product.prodname_copilot_short %} agent mode](/copilot/how-tos/chat-with-copilot/chat-in-ide#agent-mode).

### {% data variables.copilot.copilot_coding_agent %}

Once your spark is connected to a {% data variables.product.github %} repository, you can use {% data variables.copilot.copilot_coding_agent %} to continue building and maintaining your app while you focus on other things.

With the coding agent, you delegate specific tasks to {% data variables.product.prodname_copilot_short %} (either by assigning an issue to {% data variables.product.prodname_copilot_short %}, or prompting {% data variables.product.prodname_copilot_short %} to create a pull request), and {% data variables.product.prodname_copilot_short %} will autonomously work in the background to complete the task. {% data variables.copilot.copilot_coding_agent %} can fix bugs, refactor code, improve test coverage, and more.

See [AUTOTITLE](/copilot/concepts/agents/coding-agent/about-coding-agent).

## Sharing your spark

When you're ready to publish your spark, you can choose from the following visibility options:

* Private to you only
* Visible to members of a specific organization on {% data variables.product.github %}
* Visible to all {% data variables.product.github %} users

You can then share your spark with others, so they can view and interact with your app. Your spark's link isn't publicly discoverable, so only people you share the link with can access your app.

Optionally, you can publish your spark as "read-only", meaning you can showcase your app to others without them being able to edit or delete app contents.

## Limitations of {% data variables.product.prodname_spark_short %}

{% data variables.product.prodname_spark_short %} uses an opinionated stack (React, TypeScript) for reliability. For best results, you should work within {% data variables.product.prodname_spark_short %}'s SDK and core framework.

You can add external libraries, but compatibility with {% data variables.product.prodname_spark_short %}'s SDK isn't guaranteed. You should always test your spark thoroughly after adding any external libraries.

By default, your spark's data store is shared across all users of the published spark. You should make sure to delete any private or sensitive data from your app before making it visible to other users. Alternatively, you can publish your spark as "read-only", so you can showcase your app to others without them being able to edit or delete app contents.

## Further reading

* [AUTOTITLE](/copilot/responsible-use/spark)
* [AUTOTITLE](/copilot/tutorials/spark/build-apps-with-spark)
* [AUTOTITLE](/copilot/how-tos/troubleshoot-copilot/troubleshoot-spark)