GitHub API Integration Guide: REST API, OAuth, Apps & Webhooks (2026)

The GitHub REST API gives you programmatic access to repositories, issues, pull requests, users, and webhooks — but before you write a single API call, you need to make the right authentication decision. Choose the wrong one and you'll either hit per-user rate limits at scale or spend weeks rebuilding your auth layer.

Quick answer: For production product integrations, use GitHub Apps — they authenticate at the installation level (not per-user), receive 15,000 API requests/hour per installation, and support fine-grained permissions. Use OAuth Apps when you need to act as the user. Use Personal Access Tokens for scripts and one-off automation only.

This guide covers everything you need to build a complete GitHub API integration: authentication setup for all three methods, REST API endpoints for issues, repos, users, and labels, webhook configuration with signature verification, rate limits, and three real-world integration patterns with working Python code.

If your product needs to support GitHub alongside other issue trackers like Jira, Linear, or Asana, there's a unified approach worth knowing about — covered in the Building with Knit section.

The GitHub API: REST, GraphQL, and Webhooks

GitHub exposes three API surfaces. Understanding which to use before you start building saves significant refactoring later.

| What you want to do | Recommended approach |
| --- | --- |
| Read/write issues, PRs, repos, users | REST API (api.github.com) |
| Fetch deeply nested data in one request | GraphQL API (api.github.com/graphql) |
| React to events in real time (push, PR opened, issue created) | Webhooks |
| Get notified about events without running a server | GitHub Apps + webhook delivery |
| Two-way sync with GitHub events | Webhooks + REST API |
| CLI scripts and one-off automation | Personal Access Token + REST API |

REST API is the right choice for the vast majority of product integrations. The GraphQL API is useful when you need to fetch nested relationships (issues with their labels, assignees, and comments) in a single query and want to avoid over-fetching. Webhooks are event-driven and complement REST — they notify your server when something happens, then you call REST to get full details.

The GitHub REST API base URL is https://api.github.com. All endpoints accept and return JSON. The API version is specified via the X-GitHub-Api-Version header — always pin this to avoid breaking changes:

GET /repos/{owner}/{repo}/issues
Authorization: Bearer {token}
Accept: application/vnd.github+json
X-GitHub-Api-Version: 2022-11-28
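To avoid repeating these headers on every call, one option (a sketch using the requests library that the rest of this guide uses) is to pin them on a Session:

```python
import requests

def github_session(token: str) -> requests.Session:
    """Session with the recommended GitHub headers pinned on every request."""
    session = requests.Session()
    session.headers.update({
        "Authorization": f"Bearer {token}",
        "Accept": "application/vnd.github+json",
        "X-GitHub-Api-Version": "2022-11-28",
    })
    return session

# Usage (illustrative repo):
# issues = github_session(token).get(
#     "https://api.github.com/repos/octocat/hello-world/issues"
# ).json()
```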

Authentication: GitHub Apps vs OAuth Apps vs Personal Access Tokens

This is the most consequential decision in any GitHub integration. Here's what each approach actually means for a production system:

| | GitHub Apps | OAuth Apps | Personal Access Tokens |
| --- | --- | --- | --- |
| Authenticates as | The app (installation-level) | The user | The user |
| Rate limit | 15,000 req/hr per installation | 5,000 req/hr per user | 5,000 req/hr |
| Token expiry | Installation tokens expire in 1 hour | No expiry (until revoked) | No expiry (until revoked) |
| Permission model | Fine-grained (repository-level) | Broad OAuth scopes | Broad or fine-grained scopes |
| Best for | Production integrations | User-acting flows | Scripts, CI/CD, personal automation |
| Multi-tenant | ✅ Yes — one app, many installations | ✅ Yes — per-user OAuth flow | ❌ No — tied to a single user |
| Webhooks | Built-in, per-installation | Separate setup | Separate setup |

Option 1: GitHub Apps (Recommended for Product Integrations)

GitHub Apps are the most powerful option and the right default for any B2B product integration. A GitHub App is installed on an organization or repository, not tied to a user account, and generates short-lived installation tokens.

Step 1: Register a GitHub App

Go to Settings → Developer settings → GitHub Apps → New GitHub App. Key fields:

  • Webhook URL: your server's endpoint for incoming events
  • Permissions: select only what you need (Issues: Read & Write, Metadata: Read)
  • Where can this GitHub App be installed? → Any account (for multi-tenant products)

GitHub generates a private key (.pem file) and an App ID. Store both securely.

Step 2: Generate a JWT

import jwt
import time
from pathlib import Path

def generate_github_jwt(app_id: str, private_key_path: str) -> str:
    private_key = Path(private_key_path).read_text()
    payload = {
        "iat": int(time.time()) - 60,       # Issued at (60s buffer for clock skew)
        "exp": int(time.time()) + (10 * 60), # Expires in 10 minutes (max)
        "iss": app_id
    }
    return jwt.encode(payload, private_key, algorithm="RS256")

Step 3: Exchange the JWT for an Installation Token

import requests

def get_installation_token(jwt_token: str, installation_id: str) -> str:
    """
    Installation tokens expire after 1 hour.
    Cache and refresh them before expiry in production.
    """
    response = requests.post(
        f"https://api.github.com/app/installations/{installation_id}/access_tokens",
        headers={
            "Authorization": f"Bearer {jwt_token}",
            "Accept": "application/vnd.github+json",
            "X-GitHub-Api-Version": "2022-11-28"
        }
    )
    response.raise_for_status()
    data = response.json()
    return data["token"]  # This is your installation access token

The installation token is used exactly like any other Bearer token for subsequent API calls. Because it expires in 1 hour, build a caching layer that refreshes tokens 5 minutes before expiry.
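A minimal sketch of such a caching layer. InstallationTokenManager and its fetch_token callable are illustrative names, not part of any GitHub SDK; fetch_token would wrap get_installation_token above and return both the token and its expiry as a Unix timestamp (the API response includes an expires_at field).

```python
import time

class InstallationTokenManager:
    """Caches installation tokens, refreshing them 5 minutes before expiry.

    fetch_token: callable(installation_id) -> (token, expires_at_unix).
    """
    REFRESH_MARGIN = 300  # seconds: refresh 5 minutes early

    def __init__(self, fetch_token):
        self._fetch_token = fetch_token
        self._cache = {}  # installation_id -> (token, expires_at)

    def get_token(self, installation_id: str) -> str:
        cached = self._cache.get(installation_id)
        if cached and time.time() < cached[1] - self.REFRESH_MARGIN:
            return cached[0]  # Still comfortably valid
        token, expires_at = self._fetch_token(installation_id)
        self._cache[installation_id] = (token, expires_at)
        return token
```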

Step 4: Redirect users to install your GitHub App

https://github.com/apps/{app-name}/installations/new

After installation, GitHub redirects to your callback URL with an installation_id. Store this per-customer in your database.
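A framework-agnostic sketch of that callback step. The function name and the returned record shape are illustrative, not a GitHub-defined interface; GitHub supplies the installation_id and setup_action query parameters on the redirect.

```python
def handle_install_callback(query: dict, customer_id: str) -> dict:
    """Extract installation details from GitHub's post-install redirect.

    query: the callback's query parameters as a dict.
    Returns the record to persist per customer.
    """
    installation_id = query.get("installation_id")
    if not installation_id:
        raise ValueError("Missing installation_id in callback")
    return {
        "customer_id": customer_id,
        "installation_id": installation_id,
        "setup_action": query.get("setup_action", "install"),  # install | update
    }
```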

If you're building a product that needs to support GitHub alongside other issue trackers — Jira, Linear, Asana — managing GitHub Apps installation tokens per customer, while also handling different auth flows for every other tool, quickly becomes a significant engineering overhead. Knit handles GitHub auth (OAuth and PAT) and normalises the API surface across all your supported ticketing tools, so you write the integration once. See getknit.dev/integration/github.

Option 2: OAuth Apps (User-Acting Flows)

Use OAuth Apps when your integration needs to act as the user — for example, creating issues on behalf of the authenticated user, or reading private repos the user has access to.

OAuth Flow:

# Step 1: Redirect user to GitHub
auth_url = (
    "https://github.com/login/oauth/authorize"
    f"?client_id={CLIENT_ID}"
    f"&redirect_uri={REDIRECT_URI}"
    f"&scope=repo,read:user"
    f"&state={generate_csrf_token()}"  # Always validate state to prevent CSRF
)

# Step 2: Exchange code for token (after redirect back)
def exchange_code_for_token(code: str) -> str:
    response = requests.post(
        "https://github.com/login/oauth/access_token",
        data={
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "code": code
        },
        headers={"Accept": "application/json"}
    )
    return response.json()["access_token"]

OAuth App tokens do not expire automatically, but users can revoke them at any time. There is no webhook for OAuth App revocations, so treat a 401 Unauthorized response from the API as a revoked token: delete it from storage and prompt the user to re-authorize.
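A sketch of revocation handling. on_revoked is a hypothetical cleanup hook, and http_get is injectable (e.g. requests.get) so the 401 path can be exercised without the network.

```python
def github_get_as_user(url: str, token: str, on_revoked, http_get):
    """GET on behalf of the user, treating 401 as a revoked token.

    on_revoked: cleanup hook (delete the stored token, flag for re-auth).
    http_get: an HTTP callable such as requests.get.
    """
    response = http_get(url, headers={
        "Authorization": f"Bearer {token}",
        "Accept": "application/vnd.github+json",
        "X-GitHub-Api-Version": "2022-11-28",
    })
    if response.status_code == 401:
        on_revoked(token)  # Token was revoked: clean up and re-prompt
    return response
```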

Option 3: Personal Access Tokens

PATs are the simplest option — generate one in Settings → Developer settings → Personal access tokens — but they're fundamentally single-user. All API calls are attributed to the token owner, which creates audit and attribution problems in multi-tenant products. Use PATs for CI/CD pipelines, internal automation, and developer tooling only.

Fine-grained PATs allow scoping to specific repositories and permissions, making them a reasonable choice for tightly controlled automation scenarios.

Key GitHub REST API Endpoints

Issues

Issues are the core resource for most GitHub integrations. GitHub's Issues API also returns pull requests — always check for the pull_request field if you want to exclude PRs.

List issues in a repository:

def list_issues(owner: str, repo: str, token: str, state: str = "open") -> list:
    """
    Returns up to 100 issues per page.
    Iterate Link headers for full pagination.
    pull_request field is present on PRs — filter if needed.
    """
    issues = []
    url = f"https://api.github.com/repos/{owner}/{repo}/issues"
    params = {"state": state, "per_page": 100}
    headers = {
        "Authorization": f"Bearer {token}",
        "Accept": "application/vnd.github+json",
        "X-GitHub-Api-Version": "2022-11-28"
    }

    while url:
        response = requests.get(url, params=params, headers=headers)
        response.raise_for_status()
        issues.extend([i for i in response.json() if "pull_request" not in i])
        
        # GitHub returns pagination via Link header
        link_header = response.headers.get("Link", "")
        url = extract_next_url(link_header)  # Parse rel="next" from header
        params = {}  # Next URL already includes params

    return issues
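list_issues above relies on an extract_next_url helper. A sketch of that parser, given that the Link header has the form `<url>; rel="next", <url>; rel="last"`:

```python
def extract_next_url(link_header: str):
    """Parse the rel="next" URL from a GitHub Link header, or None."""
    for part in link_header.split(","):
        url_part, _, rel_part = part.partition(";")
        if 'rel="next"' in rel_part:
            # Strip surrounding whitespace first, then the angle brackets
            return url_part.strip().strip("<>")
    return None  # No next page: pagination is done
```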

Create an issue:

def create_issue(owner: str, repo: str, token: str,
                 title: str, body: str, labels: list = None,
                 assignees: list = None) -> dict:
    response = requests.post(
        f"https://api.github.com/repos/{owner}/{repo}/issues",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
            "X-GitHub-Api-Version": "2022-11-28"
        },
        json={
            "title": title,
            "body": body,
            "labels": labels or [],
            "assignees": assignees or []
        }
    )
    response.raise_for_status()
    return response.json()  # Returns full issue object including issue number and URL

Update an issue (assign, label, close):

def update_issue(owner: str, repo: str, issue_number: int, token: str, **fields) -> dict:
    """
    Supports: title, body, state (open/closed), labels, assignees, milestone.
    """
    response = requests.patch(
        f"https://api.github.com/repos/{owner}/{repo}/issues/{issue_number}",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
            "X-GitHub-Api-Version": "2022-11-28"
        },
        json=fields
    )
    response.raise_for_status()
    return response.json()

Repositories

# List repositories for an organization
GET /orgs/{org}/repos?type=all&per_page=100

# Get a specific repository
GET /repos/{owner}/{repo}

# List repository collaborators
GET /repos/{owner}/{repo}/collaborators

Users and Members

# Get the authenticated user
GET /user

# Get a user by username
GET /users/{username}

# List organization members
GET /orgs/{org}/members
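For the user-mapping layer most integrations end up needing, the members payload can be indexed by login. A sketch (note: most GitHub endpoints do not expose user emails, so login is the stable key to map against your own user model):

```python
def build_assignee_index(members: list) -> dict:
    """Map GitHub logins to member records for assignee lookups.

    members: the JSON array returned by GET /orgs/{org}/members.
    """
    return {m["login"]: {"id": m["id"], "url": m["html_url"]} for m in members}
```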

Labels and Milestones

# List all labels in a repository
GET /repos/{owner}/{repo}/labels

# Create a label
POST /repos/{owner}/{repo}/labels
Body: {"name": "bug", "color": "d73a4a", "description": "Something isn't working"}

# List milestones
GET /repos/{owner}/{repo}/milestones?state=open
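Label sync usually needs to be idempotent: create a label only if it doesn't already exist. A sketch using GET /repos/{owner}/{repo}/labels/{name} (which returns 404 for a missing label); http_get and http_post are injectable HTTP callables (e.g. requests.get / requests.post) so the logic is testable offline.

```python
def ensure_label(owner: str, repo: str, name: str, color: str, token: str,
                 *, http_get, http_post) -> dict:
    """Create a label if it doesn't already exist; return the label object."""
    base = f"https://api.github.com/repos/{owner}/{repo}/labels"
    headers = {
        "Authorization": f"Bearer {token}",
        "Accept": "application/vnd.github+json",
        "X-GitHub-Api-Version": "2022-11-28",
    }
    existing = http_get(f"{base}/{name}", headers=headers)
    if existing.status_code == 200:
        return existing.json()  # Already present: nothing to do
    created = http_post(base, headers=headers,
                        json={"name": name, "color": color})
    created.raise_for_status()
    return created.json()
```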

Webhooks: Real-Time Event Handling

Webhooks let GitHub push events to your server rather than requiring you to poll the API. Configure them in repository or organization settings, or programmatically via the API.

Create a webhook via the API:

def create_webhook(owner: str, repo: str, token: str,
                   payload_url: str, secret: str, events: list) -> dict:
    response = requests.post(
        f"https://api.github.com/repos/{owner}/{repo}/hooks",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
            "X-GitHub-Api-Version": "2022-11-28"
        },
        json={
            "name": "web",
            "active": True,
            "events": events,  # e.g. ["issues", "pull_request", "push"]
            "config": {
                "url": payload_url,
                "content_type": "json",
                "secret": secret,
                "insecure_ssl": "0"
            }
        }
    )
    response.raise_for_status()
    return response.json()

Verifying Webhook Signatures

Every GitHub webhook payload includes an X-Hub-Signature-256 header. You must verify this on every incoming request — skip this step and your endpoint can be spoofed by anyone who discovers its URL.

import hmac
import hashlib
from flask import Flask, request, abort

app = Flask(__name__)
WEBHOOK_SECRET = b"your-webhook-secret"

@app.route("/webhook/github", methods=["POST"])
def handle_github_webhook():
    # Verify signature before processing anything
    signature_header = request.headers.get("X-Hub-Signature-256", "")
    if not signature_header.startswith("sha256="):
        abort(400, "Missing or malformed signature")

    expected_sig = hmac.new(
        WEBHOOK_SECRET,
        request.data,           # Raw bytes — don't use parsed JSON here
        hashlib.sha256
    ).hexdigest()

    received_sig = signature_header[7:]  # Strip "sha256=" prefix

    # Constant-time comparison prevents timing attacks
    if not hmac.compare_digest(expected_sig, received_sig):
        abort(401, "Invalid signature")

    # Safe to process the payload now
    payload = request.json
    event_type = request.headers.get("X-GitHub-Event")

    if event_type == "issues":
        handle_issue_event(payload)
    elif event_type == "pull_request":
        handle_pr_event(payload)

    return "", 200

def handle_issue_event(payload: dict):
    action = payload["action"]  # opened, closed, labeled, assigned, etc.
    issue = payload["issue"]
    repo = payload["repository"]

    if action == "opened":
        print(f"New issue #{issue['number']} in {repo['full_name']}: {issue['title']}")

Supported webhook events for issue integrations: issues, issue_comment, label, milestone, pull_request, push, repository.

GitHub does not automatically retry failed webhook deliveries; you can redeliver them manually from the webhook settings UI or via the deliveries API. Return a 200 response immediately on receipt and process the payload asynchronously to avoid delivery timeouts (GitHub expects a response within 10 seconds).
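To return 200 quickly, the Flask handler above can hand verified payloads to a background worker. A minimal sketch using the standard library (the handler callable is whatever dispatch function you already have, e.g. handle_issue_event):

```python
import queue
import threading

def start_webhook_worker(handler) -> queue.Queue:
    """Process webhook payloads off the request thread.

    handler: callable(event_type, payload) doing the actual work.
    The Flask route then just does:
        q.put((event_type, request.json)); return "", 200
    """
    q = queue.Queue()

    def worker():
        while True:
            event_type, payload = q.get()
            try:
                handler(event_type, payload)
            finally:
                q.task_done()  # Mark processed even if handler raised

    threading.Thread(target=worker, daemon=True).start()
    return q
```

For production you'd likely swap the in-process queue for a durable one (e.g. a job queue backed by Redis or a database), since queued payloads here are lost on restart.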

Rate Limits

| Authentication method | Requests per hour | Search API |
| --- | --- | --- |
| Unauthenticated | 60 | 10/min |
| OAuth App / PAT | 5,000 | 30/min |
| GitHub App (installation token) | 15,000 | 30/min |
| GitHub App (user token) | 5,000 | 30/min |

Note: the 15,000 req/hr figure applies to installations on GitHub Enterprise Cloud organizations; other installations start at 5,000 req/hr and scale with the number of repositories and users.

Rate limit status is returned in every response:

X-RateLimit-Limit: 5000
X-RateLimit-Remaining: 4823
X-RateLimit-Reset: 1747353600   # Unix timestamp when the limit resets
X-RateLimit-Used: 177

When X-RateLimit-Remaining reaches 0, GitHub rejects requests until the window resets; use the X-RateLimit-Reset timestamp to compute how long to wait. Build rate limit handling into your HTTP client from the start:

def github_request(url: str, token: str, **kwargs) -> requests.Response:
    response = requests.get(url, headers={
        "Authorization": f"Bearer {token}",
        "Accept": "application/vnd.github+json",
        "X-GitHub-Api-Version": "2022-11-28"
    }, **kwargs)

    if response.status_code == 403 and "X-RateLimit-Remaining" in response.headers:
        if response.headers["X-RateLimit-Remaining"] == "0":
            reset_time = int(response.headers["X-RateLimit-Reset"])
            wait = max(0, reset_time - int(time.time())) + 5  # 5s buffer
            time.sleep(wait)
            return github_request(url, token, **kwargs)  # Retry

    response.raise_for_status()
    return response

For secondary rate limits (triggered by too many concurrent requests), watch for Retry-After in the response headers and honor it exactly.
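A small helper for computing that wait, a sketch that prefers Retry-After (how GitHub signals secondary limits, in seconds) and falls back to the primary-limit reset timestamp:

```python
import time

def backoff_seconds(response_headers: dict) -> float:
    """How long to sleep before retrying a rate-limited request.

    Prefers Retry-After (secondary limits), then X-RateLimit-Reset
    (primary limits), then a conservative default.
    """
    retry_after = response_headers.get("Retry-After")
    if retry_after is not None:
        return float(retry_after)
    reset = response_headers.get("X-RateLimit-Reset")
    if reset is not None:
        return max(0.0, int(reset) - time.time())
    return 60.0  # Neither header present: back off conservatively
```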

3 Common GitHub Integration Patterns

Pattern 1: Sync Issues from GitHub to Your Product

The most common integration pattern: pull issues from one or more GitHub repos and display or sync them inside your product.

import requests
import time

def sync_all_issues(installations: list, token_manager) -> list:
    """
    Full issue sync across multiple repositories.
    Returns a normalised list of issues for storage.
    """
    all_issues = []

    for installation in installations:
        token = token_manager.get_token(installation["id"])  # Cached + auto-refreshed

        for repo in installation["repos"]:
            owner, name = repo["owner"], repo["name"]
            page_url = f"https://api.github.com/repos/{owner}/{name}/issues"
            params = {"state": "all", "per_page": 100}

            while page_url:
                resp = requests.get(page_url, params=params, headers={
                    "Authorization": f"Bearer {token}",
                    "Accept": "application/vnd.github+json",
                    "X-GitHub-Api-Version": "2022-11-28"
                })
                resp.raise_for_status()

                for issue in resp.json():
                    if "pull_request" in issue:
                        continue  # Skip PRs

                    all_issues.append({
                        "id": issue["number"],
                        "title": issue["title"],
                        "state": issue["state"],
                        "assignees": [a["login"] for a in issue["assignees"]],
                        "labels": [l["name"] for l in issue["labels"]],
                        "url": issue["html_url"],
                        "created_at": issue["created_at"],
                        "updated_at": issue["updated_at"],
                        "repo": f"{owner}/{name}"
                    })

                # Parse next page from Link header
                link = resp.headers.get("Link", "")
                next_url = next(
                    (p.split(";")[0].strip().strip("<>") for p in link.split(",")
                     if 'rel="next"' in p), None
                )
                page_url = next_url
                params = {}

    return all_issues
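Full syncs like the one above get expensive on large repos. The issues endpoint accepts a `since` parameter (ISO 8601, filtered on updated_at), so subsequent runs can fetch only what changed. A sketch with an injectable http_get (e.g. requests.get) for testability:

```python
def sync_updated_issues(owner: str, repo: str, token: str,
                        since_iso: str, *, http_get) -> list:
    """Incremental sync: only issues updated since the last run.

    since_iso: ISO 8601 timestamp of the previous successful sync.
    """
    resp = http_get(
        f"https://api.github.com/repos/{owner}/{repo}/issues",
        params={"state": "all", "since": since_iso, "per_page": 100},
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
            "X-GitHub-Api-Version": "2022-11-28",
        },
    )
    resp.raise_for_status()
    # The issues endpoint includes PRs: filter them out as before
    return [i for i in resp.json() if "pull_request" not in i]
```

Pagination via the Link header applies here too; this sketch shows only the first page.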

Pattern 2: Create Issues from Your Product

When a user creates a task in your product and wants it to appear in GitHub:

def create_github_issue_from_task(task: dict, repo_config: dict, token: str) -> dict:
    """
    Maps your product's task model to a GitHub issue.
    Returns the created issue with GitHub's issue number for cross-referencing.
    """
    # Map your assignees to GitHub usernames
    github_assignees = [
        repo_config["user_mapping"].get(uid)
        for uid in task.get("assignee_ids", [])
        if repo_config["user_mapping"].get(uid)
    ]

    # Map your labels/tags to GitHub label names
    github_labels = [
        repo_config["label_mapping"].get(tag)
        for tag in task.get("tags", [])
        if repo_config["label_mapping"].get(tag)
    ]

    response = requests.post(
        f"https://api.github.com/repos/{repo_config['owner']}/{repo_config['repo']}/issues",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
            "X-GitHub-Api-Version": "2022-11-28"
        },
        json={
            "title": task["title"],
            "body": f"{task['description']}\n\n---\n*Created via {task['source']}*",
            "assignees": github_assignees,
            "labels": github_labels,
            "milestone": repo_config.get("milestone_id")
        }
    )
    response.raise_for_status()
    github_issue = response.json()

    # Store the GitHub issue number in your database for future updates
    return {
        "github_issue_number": github_issue["number"],
        "github_issue_url": github_issue["html_url"],
        "github_issue_id": github_issue["id"]
    }

Pattern 3: Bidirectional Status Sync via Webhooks

Keep issue state in sync in real time — when a GitHub issue is closed, close the linked item in your product; and vice versa.

# Webhook handler (GitHub → your product)
def handle_issue_state_change(payload: dict):
    action = payload["action"]

    if action not in ("closed", "reopened"):
        return  # Only care about state changes

    github_issue_id = payload["issue"]["id"]
    new_state = "closed" if action == "closed" else "open"

    # Look up the linked task in your DB
    task_id = db.get_task_by_github_id(github_issue_id)
    if task_id:
        db.update_task_state(task_id, new_state)
        print(f"Synced GitHub issue {github_issue_id} → Task {task_id}: {new_state}")


# REST handler (your product → GitHub)
def close_github_issue_for_task(task_id: str, token: str):
    github_info = db.get_github_info_for_task(task_id)
    if not github_info:
        return

    update_issue(
        owner=github_info["owner"],
        repo=github_info["repo"],
        issue_number=github_info["issue_number"],
        token=token,
        state="closed"
    )
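Bidirectional sync has a failure mode worth guarding against: a state change you push to GitHub comes back to you as a webhook, which triggers another write, and so on. A sketch of a loop-safe handler; db and its methods are a hypothetical data layer mirroring the one used above, except that lookup returns the full task record:

```python
def handle_issue_state_change_safe(payload: dict, db) -> bool:
    """Loop-safe variant of the webhook handler above.

    Returns True only when a write actually happened, so echoed or
    duplicate deliveries become no-ops.
    """
    action = payload["action"]
    if action not in ("closed", "reopened"):
        return False

    new_state = "closed" if action == "closed" else "open"
    task = db.get_task_by_github_id(payload["issue"]["id"])

    # Skip the write when states already match: this breaks the
    # GitHub -> product -> GitHub echo loop in bidirectional sync.
    if task is None or task["state"] == new_state:
        return False

    db.update_task_state(task["id"], new_state)
    return True
```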

Building GitHub Integrations with Knit

GitHub Apps auth — JWTs, per-installation tokens that expire hourly, managing token refresh across multiple customer installations — is the part of a GitHub integration that adds the most engineering overhead for the least user-visible value.

Knit provides a unified ticketing API that handles GitHub authentication (OAuth and Personal Access Token flows) for your customers. Instead of building and maintaining the OAuth consent flow, token storage, and refresh logic, your customers connect their GitHub account once through Knit's auth layer. You call Knit's normalised endpoints using a single set of headers:

Authorization: Bearer {your-knit-api-token}
X-Knit-Integration-Id: {customer-integration-id}

This is particularly valuable if your product supports GitHub alongside other issue trackers — Jira, Linear, Asana, Zendesk, and more are all available through the same Knit interface, so you build the integration pattern once and it works across all of them.

The Knit APIs available for GitHub:

| Knit API | Endpoint | Maps to in GitHub | Use cases |
| --- | --- | --- | --- |
| Get Accounts | GET /ticketing/accounts | GitHub organisations | List all orgs a user belongs to; populate org picker |
| Get Account By Id | GET /ticketing/account?accountId= | Single GitHub organisation | Fetch org details for a specific installation |
| Get Users | GET /ticketing/users?accountId= | GitHub users in an org | Build user directory; populate assignee dropdown. Note: GitHub does not return user emails via this endpoint |
| Get User By Id | GET /ticketing/user?userId= | Single GitHub user | Look up a specific user by ID |
| Get Groups | GET /ticketing/groups?accountId= | GitHub teams | List teams in an org; map to internal groups or access levels |
| Get Group By Id | GET /ticketing/group?groupId= | Single GitHub team | Fetch team membership for access control |
| Get Tags | GET /ticketing/tags?accountId=&collectionId= | GitHub labels on a repo | Sync labels for filtering and categorisation |

Example: fetch all teams in a GitHub org via Knit

import requests

def get_github_teams_via_knit(knit_token: str, integration_id: str,
                               account_id: str) -> list:
    """
    Returns GitHub teams for the given org (account_id).
    No JWT generation, no installation tokens, no token refresh logic.
    """
    response = requests.get(
        "https://api.getknit.dev/v1.0/ticketing/groups",
        headers={
            "Authorization": f"Bearer {knit_token}",
            "X-Knit-Integration-Id": integration_id
        },
        params={"accountId": account_id}
    )
    response.raise_for_status()
    data = response.json()

    # Cursor-based pagination built in
    groups = data["data"]["groups"]
    next_cursor = data["data"]["pagination"].get("next")

    while next_cursor:
        response = requests.get(
            "https://api.getknit.dev/v1.0/ticketing/groups",
            headers={
                "Authorization": f"Bearer {knit_token}",
                "X-Knit-Integration-Id": integration_id
            },
            params={"accountId": account_id, "cursor": next_cursor}
        )
        response.raise_for_status()
        page = response.json()
        groups.extend(page["data"]["groups"])
        next_cursor = page["data"]["pagination"].get("next")

    return groups

→ See the full GitHub integration on Knit: getknit.dev/integration/github

→ Knit's ticketing API docs: developers.getknit.dev

What to Build First

If you're building a GitHub integration from scratch, this is the order that minimises rework:

  1. Register your GitHub App and generate your private key — do this before writing any API code. Your App ID and private key are required for every subsequent step.
  2. Build your installation token manager — a simple class that generates JWTs, exchanges them for installation tokens, and caches tokens until 5 minutes before expiry. Every other part of your integration depends on this.
  3. Implement the OAuth installation flow — redirect users to install your app, capture the installation_id on callback, and store it per customer.
  4. Set up your webhook endpoint with signature verification — register it while creating the GitHub App. Getting verification right from day one prevents security issues later.
  5. Implement the issues endpoints — list, create, and update. These cover 80% of typical GitHub product integrations.
  6. Build your user mapping layer — fetch org members and map them to your product's user identifiers. GitHub users don't expose email addresses via most endpoints, so login (username) is your reliable identifier.
  7. Add label and milestone sync — fetch these once on installation and cache them; they change infrequently.
  8. Wire up bidirectional status sync — close/reopen issues in response to both webhook events (GitHub → your product) and user actions (your product → GitHub).

Summary

| Topic | Key fact |
| --- | --- |
| Recommended auth | GitHub Apps for production; OAuth Apps for user-acting flows; PATs for scripts only |
| Installation token lifetime | Expires after 1 hour — build a refresh mechanism |
| Rate limit (GitHub Apps) | 15,000 req/hr per installation |
| Rate limit (OAuth/PAT) | 5,000 req/hr |
| Issues endpoint | GET /repos/{owner}/{repo}/issues — includes PRs, use pull_request field to filter |
| Webhook verification | HMAC-SHA256 via X-Hub-Signature-256, constant-time comparison required |
| API version header | Always send X-GitHub-Api-Version: 2022-11-28 |
| Multi-integration shortcut | Knit handles GitHub auth (OAuth/PAT) and normalises across Jira, Linear, Asana, and more |

Frequently Asked Questions

What is the difference between GitHub Apps, OAuth Apps, and Personal Access Tokens?

GitHub Apps are the recommended approach for building integrations — they authenticate as the app itself, support fine-grained permissions, and receive 15,000 API requests/hour per installation. OAuth Apps authenticate as a user and are limited to the user's rate limit of 5,000 requests/hour. Personal Access Tokens are best for scripts and automation where a single user account controls access, but they do not scale across multiple users.

How do I authenticate with the GitHub REST API?

Pass your token in the Authorization header: Authorization: Bearer {token}. For GitHub Apps, generate a JWT signed with your app's private key, then exchange it for an installation access token via POST /app/installations/{installation_id}/access_tokens. For OAuth Apps and PATs, pass the token directly. Unauthenticated requests are limited to 60 requests per hour; authenticated requests get 5,000 per hour.

What are the GitHub REST API rate limits?

Unauthenticated requests: 60 per hour. Authenticated OAuth Apps and PATs: 5,000 requests per hour. GitHub Apps using installation tokens: 15,000 requests per hour per installation. Search API requests: 30 per minute for authenticated users, 10 per minute for unauthenticated. Rate limit status is returned on every response via X-RateLimit-Remaining and X-RateLimit-Reset headers.

How do GitHub webhooks work?

GitHub webhooks send HTTP POST payloads to a URL you configure whenever a subscribed event occurs. Every payload includes an X-Hub-Signature-256 header — an HMAC-SHA256 signature of the raw request body using your webhook secret. You must verify this signature on every incoming request. GitHub expects a response within 10 seconds; failed deliveries are not retried automatically, but can be redelivered from the webhook settings UI or via the deliveries API.

How do I list all issues from a GitHub repository via the API?

Use GET /repos/{owner}/{repo}/issues. By default this returns open issues and pull requests. Filter with state=open, state=closed, or state=all. Use labels, assignee, and milestone query params to narrow results. Results are paginated at 30 items per page by default — use per_page (max 100) and the Link response header to navigate pages. Pull requests are included in the issues endpoint; filter them out by checking for the pull_request field.

What is the difference between the GitHub REST API and GraphQL API?

The GitHub REST API has separate endpoints per resource and is the standard choice for most integrations. The GitHub GraphQL API (v4) lets you request exactly the fields you need in a single query, reducing over-fetching. Use REST when building straightforward CRUD integrations. Use GraphQL when you need to fetch deeply nested relationships — issues with their comments, labels, and assignees — in a single request.

How do I verify a GitHub webhook signature?

Compute HMAC-SHA256 of the raw request body using your webhook secret as the key. Compare this digest to the value in the X-Hub-Signature-256 header (prefixed with sha256=). Use a constant-time comparison function (like hmac.compare_digest in Python) to prevent timing attacks. Never process webhook payloads before verifying the signature.

Is there a simpler way to integrate GitHub without managing OAuth or GitHub Apps authentication myself?

Yes. Knit provides a unified ticketing API that handles GitHub authentication (OAuth and PAT) for you. Instead of implementing the OAuth flow, managing token storage, or dealing with per-user credentials, your customers connect their GitHub account once through Knit's auth layer. You then call Knit's normalised endpoints — for organisations, users, teams, and labels — without writing auth infrastructure. This is especially useful if you also need to support Jira, Linear, or Asana alongside GitHub, as Knit's same API surface covers all of them. → getknit.dev/integration/github
