GitHub's May 13 Agent API Just Made 346 Engineers Sourceable
GitHub shipped the Copilot Cloud Agent REST API on May 13. Here is how to turn the 346 awesome-copilot contributors into a named AI-native pipeline.
On May 13, 2026, GitHub shipped the Copilot Cloud Agent REST API in public preview. Two days earlier, GM cut 600 IT workers and said out loud what every Fortune 500 hiring manager has been hinting at for six months: they want engineers who build agents and MCP workflows, not engineers who use Copilot. The pipeline for that work is already public, already enumerable, and most of your competitors do not know it exists yet.
What actually shipped on May 13
The Agent tasks REST API lets Copilot Business and Copilot Enterprise customers programmatically start Copilot cloud agent tasks. The endpoint lives at api.github.com/agents/repos/OWNER/REPO/tasks against API version 2026-03-10. The use cases GitHub called out in the changelog are not personal productivity. They are fleet operations: fan out refactors or migrations across many repositories from a script, set up new repositories in one click from an internal developer portal, automatically prepare a weekly release with notes.
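The fan-out use case can be sketched in a few lines. The endpoint path and the `X-GitHub-Api-Version` header below come from the changelog details above; the request-body shape (a single `prompt` field) and the `fan_out` helper are illustrative assumptions, not the documented schema:

```python
import json
import urllib.request

API_VERSION = "2026-03-10"  # version named in the changelog

def build_task_request(owner: str, repo: str, prompt: str, token: str) -> urllib.request.Request:
    """Build a POST to the Agent tasks endpoint.

    URL and version header are from the changelog; the body shape
    ({"prompt": ...}) is an assumption for illustration only.
    """
    url = f"https://api.github.com/agents/repos/{owner}/{repo}/tasks"
    body = json.dumps({"prompt": prompt}).encode()
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
            "X-GitHub-Api-Version": API_VERSION,
        },
    )

def fan_out(repos: list[str], prompt: str, token: str) -> None:
    """Fan one migration prompt out across many repos from a script."""
    for full_name in repos:
        owner, repo = full_name.split("/")
        req = build_task_request(owner, repo, prompt, token)
        # Requires Copilot Business/Enterprise entitlements on the token.
        with urllib.request.urlopen(req) as resp:
            print(full_name, resp.status)
```

Swap in your own repo list and a PAT with the right entitlements; the point is that one script replaces a quarter of ticket-driven migration toil, which is exactly the job the new hire is being sourced for.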
This matters for sourcing because it changes the job. Before May 13, "uses Copilot" was a personal claim that meant nothing on a resume. After May 13, enterprise buyers need engineers who can build automations on top of cloud agents. The companion org-management API shipped March 24, 2026, completing the enterprise control plane. Support for GitHub App installation tokens and Copilot Pro access is still listed as "coming soon," which tells you exactly who the early demand pool is: enterprise, paying, and short on builders.
The skills GM is hiring for, by name
GM was explicit. They are hiring for AI-native development, data engineering, model training, agent development, prompt engineering, and cloud architecture. They have already brought in Behrad Toghi from Apple and Rashed Haq from Cruise to lead the AI and autonomous vehicle work. The justification line they keep repeating: nearly 90 percent of the code for GM's autonomous driving software is now written by AI.
An automaker in Warren, Michigan is not the obvious first choice for a machine learning engineer choosing between offers from Google, Nvidia, and Apple. GM knows this. So does every other Fortune 500 chasing the same skill set. The recruiters who win the next 12 to 18 months are the ones sourcing by Git history, not by waiting on LinkedIn search to surface "AI-native" titles that mostly do not exist yet.
Why awesome-copilot is the cleanest signal on the open web
The github/awesome-copilot repo is the canonical community catalog: instructions, agents, skills, and configurations for getting the most out of GitHub Copilot. It sits at roughly 31K stars and 346 contributors as of this week. The numbers move weekly, because contributor attribution is automated via the contributors.yml workflow, which runs on a cron, checks for missing contributors using all-contributors, runs contributor-report.mjs to infer contribution types from file paths, and opens a PR with updated attribution.
Translation for sourcers: the repo tells you not just who contributed, but what kind of primitive they shipped, with file-path precision. That is the difference between "shows up on a Copilot LinkedIn search" and "merged an agent with an MCP server dependency into the canonical Microsoft-blessed catalog."
The six primitive folders map cleanly to different reqs:
- /agents with MCP frontmatter: tool-using-LLM engineers. The agent declares its MCP server dependencies inline, and the config gets merged into .vscode/mcp.json on install. These are the people GM and every enterprise actually want.
- /hooks: event-driven workflow engineers. Closer to platform and DevEx.
- /skills and /instructions: prompt and instruction designers. Adjacent to applied AI and DX roles.
- /workflows and /plugins: integration and tooling engineers.
If you source the repo as one bucket, you are wasting the signal. Slice by folder and you are handing a hiring manager a pre-segmented shortlist that matches the job description they actually wrote.
The LinkedIn problem this fixes
A keyword scan of US profiles for the combination "MCP + agent + copilot" returns under ten strong matches today, and most of them cluster as "Technical Architect, AI/GenAI" or "Senior AI Engineer, GenAI" at Optum, CGI, Macy's, and TCS. Those are enterprise SI consultants, not the builders shipping agents into public repos. The LinkedIn supply for this skill set is currently dominated by titles that sound right and commits that do not exist.
This is the gap Refolk was built to close. You describe the engineer in plain English ("contributors to github/awesome-copilot whose merged PRs touched /agents with MCP server frontmatter, based in North America, not currently at FAANG") and get a ranked shortlist with the GitHub handle, the LinkedIn profile, and the contribution evidence stitched together. The contribution proof is the part LinkedIn cannot show you and the part your hiring manager will actually read.
A repo-by-repo playbook for the next 90 days
1. Pull the contributor list from the live README
The all-contributors workflow keeps the README contributor section current. Names like Michael A. Volz (Flynn) appear in the contributor table with avatars and the primitive types they touched. Export the section, deduplicate against your ATS, and you have a starter list of 346 candidates who do not yet know they are being sourced.
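A rough way to pull that starter list without scraping the README is the standard GitHub REST contributors endpoint. Caveat: that endpoint reflects commit authors, while the README table is maintained by all-contributors, so the two lists can differ slightly; the `dedupe_against_ats` helper is a hypothetical stand-in for your own ATS matching.

```python
import json
import urllib.request

def fetch_contributors(owner: str, repo: str) -> list[str]:
    """Page through GET /repos/{owner}/{repo}/contributors and return logins."""
    logins, page = [], 1
    while True:
        url = (f"https://api.github.com/repos/{owner}/{repo}"
               f"/contributors?per_page=100&page={page}")
        req = urllib.request.Request(url, headers={"Accept": "application/vnd.github+json"})
        with urllib.request.urlopen(req) as resp:
            batch = json.load(resp)
        if not batch:          # empty page means we've paged past the end
            return logins
        logins.extend(c["login"] for c in batch)
        page += 1

def dedupe_against_ats(logins: list[str], ats_handles: set[str]) -> list[str]:
    """Drop handles already in your ATS; GitHub logins are case-insensitive."""
    seen = {h.lower() for h in ats_handles}
    return [l for l in logins if l.lower() not in seen]

# Example: starter = dedupe_against_ats(fetch_contributors("github", "awesome-copilot"), ats_handles)
```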
2. Bucket by primitive folder
Run a git log filter per folder and attribute contributors to the primitive they actually shipped. The contributor-report.mjs script in the repo already does file-path inference; you can mirror its logic or just read its output. A contributor with three merged PRs in /agents and zero anywhere else is a different hire than a contributor with eight commits across /skills and /instructions.
3. Cross-reference against the MCP Registry working group
The official MCP Registry cross-company working group includes lead maintainer David Soria Parra and registry maintainers from Anthropic, GitHub, and PulseMCP, with contributors from Block and Microsoft. Anyone whose name appears in both awesome-copilot and the MCP Registry maintainer set is the senior tier. They build agents and they publish the servers those agents call. That overlap is the cohort hiring managers will pay above market for, and it is small enough to enumerate by hand.
4. Add punkpeye/awesome-mcp-servers as a second list
The same logic applies to the broader MCP server catalog. Contributors who appear in both repos have demonstrated they can build the tool and the tool-user. That is the full-stack agentic AI engineer recruiting profile every job description is now trying to describe in 400 words of buzzwords.
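Once you have both contributor lists (from the README export or the contributors endpoint), the senior-tier overlap is a set intersection. Handles below are hypothetical placeholders; the only real wrinkle is that GitHub logins are case-insensitive, so normalize before matching:

```python
def cross_list(list_a: list[str], list_b: list[str]) -> list[str]:
    """Case-insensitive intersection of two GitHub handle lists,
    preserving the spelling from list_a."""
    b = {h.lower() for h in list_b}
    return sorted({h for h in list_a if h.lower() in b}, key=str.lower)

# Hypothetical stand-ins for the two fetched contributor lists:
copilot = ["Alice", "Bob", "carol"]
mcp_servers = ["bob", "dave"]
# cross_list(copilot, mcp_servers) -> the built-the-tool-and-the-tool-user cohort
```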
5. Watch the Microsoft Awesome Copilot MCP Server team
Microsoft shipped its own MCP server that sits on top of awesome-copilot, letting users search community customizations and save them directly into a repo. The engineers behind that server are themselves AI-native talent, named in the developer.microsoft.com announcement and reachable through their public GitHub history.
After May 13, "uses Copilot" stops being a resume claim. Either you shipped an agent into a public repo or you did not.
What to actually say in the first message
Sourcing the list is half the work. The outreach has to acknowledge the specific primitive the person shipped, not gesture at "your impressive GitHub." Contact a contributor about their /hooks PR while referencing an agent they did not write, and you have already lost the reply. The good news is that public PRs come with a title, a description, and often a linked issue. Lead with that. Mention the primitive, mention the merge date, and explain why the role you are filling needs exactly that kind of work.
This is also where Refolk's evidence layer pays for itself, because the shortlist comes with the actual PR URL attached. You are not pasting "I saw your GitHub." You are pasting "I saw your November PR adding the Postgres MCP server agent with the pg_dump skill, and we have an open req at a logistics company that wants exactly that fanout pattern across 400 internal repos."
The window, and who closes it
The 346 number is going to move. The contributors.yml cron runs weekly, the repo keeps adding primitives, and every week the early-mover advantage shrinks. Right now, three things are simultaneously true:
- GM has publicly named the skill set and is hiring against it.
- The REST API that operationalizes the skill set just shipped.
- The talent pool of public, verifiable builders is small enough to fit on one screen.
Twelve months from now, every recruiter at every Fortune 500 will be searching the same repo. The contributors who shipped in 2025 and early 2026 will be in-seat, vested, and harder to move. If your hiring manager has an "AI-native engineer" req open this quarter, the awesome-copilot repo sourcing pass is the highest-yield two days of work on your calendar. The senior dev pool may distrust AI tooling, but the people in that 346 chose to build it in public. That is the screen.
If you want the list segmented by folder, employer, and recency without writing the scripts yourself, Refolk is the fastest path. Plain English query in, named shortlist with PR evidence out, across GitHub, LinkedIn, and the open web in one pass.
FAQ
Why source awesome-copilot specifically instead of just searching GitHub for "MCP"?
Generic MCP search on GitHub returns thousands of forks, tutorials, and toy projects. The github/awesome-copilot repo is curated, attribution is automated, and inclusion requires a merged PR reviewed by the GitHub team. The signal-to-noise ratio is roughly 100x higher, and the contributor list is enumerable rather than infinite. For sourcing AI-native engineers, curation is the feature.
Are awesome-copilot contributors actually hireable, or are they all at Microsoft and GitHub already?
A meaningful share are Microsoft and GitHub employees, and you should filter those out unless you can credibly poach. The remainder spans community contributors at startups, consultancies, and adjacent platform companies. The cross-reference with the MCP Registry working group and punkpeye/awesome-mcp-servers surfaces the independent builders. Expect roughly half the list to be movable; that is still a pool of some 170 names for a skill that returns under ten strong matches on LinkedIn keyword search.
How do I tell a real agent contributor from someone who fixed a typo?
Look at the file path, not the commit count. A contributor with one merged PR in /agents that declares MCP server dependencies in frontmatter has demonstrated more relevant skill than a contributor with twelve README fixes. The contributor-report.mjs logic in the repo infers contribution type from file path for exactly this reason. Replicate it, or use a tool that already has.
What happens to this playbook when GitHub locks down the API to enterprise only?
The Agent tasks REST API is already gated to Copilot Business and Enterprise, with Copilot Pro access listed as coming soon. That gating affects who can use the API at scale, not who can contribute to the public awesome-copilot repo. The contributor list stays public regardless. If anything, tighter enterprise gating increases demand for the builders who can operate inside it, which means the sourcing window gets more valuable, not less.