Frequently Asked Questions

Common questions about PageTurner's translation platform, features, and troubleshooting.


General Questions

How is this different from Google Translate?

Short answer: Google Translate processes sentences in isolation. PageTurner uses a 5-phase pipeline that ensures technical term consistency across your entire documentation.

Key differences:

| Feature | PageTurner | Google Translate |
| --- | --- | --- |
| Term Consistency | 99%+ (same term = same translation everywhere) | ~70% (inconsistent across pages) |
| Context Awareness | Full document context | Sentence-by-sentence |
| Technical Accuracy | 94% | ~68% |
| MDX/React Preservation | Native support, 100% preserved | Breaks components |
| Quality Score | 91.3/100 | 65-70/100 |

Example problem with Google Translate:

  • Page 1: "repository" → "repositorio"
  • Page 50: "repository" → "repo"
  • Page 100: "repository" → "almacén"

PageTurner solution: "repository" → "repositorio" on ALL pages.


What frameworks are supported?

Currently supported (Production):

  • ✅ Docusaurus - Full production support with MDX, React components, i18n

Coming soon:

  • 🚧 Next.js (Q2 2025) - App router, MDX support
  • 🚧 Nextra (Q3 2025) - Next.js-based docs framework
  • 🚧 Hugo (Q3 2025) - Static site generator
  • 🚧 VitePress (Q4 2025) - Vue-powered docs

Want another framework? Contact us to request support or vote on our roadmap.


How much does it cost?

Current Beta Pricing:

Open Source Program (FREE):

  • Free translations for qualifying open source projects
  • Up to 500 pages
  • All languages supported
  • Community support

Commercial Beta:

  • Pay-as-you-go: ~$0.10 per page per language
  • Typical costs:
    • 100 pages × 3 languages = $30
    • 200 pages × 5 languages = $100
  • Updates: 60-80% cheaper with translation memory
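
The pay-as-you-go arithmetic above is simple enough to script. A minimal sketch, assuming the beta rate of ~$0.10 per page per language; the helper name is hypothetical and not part of the PageTurner API:

```python
# Hypothetical helper illustrating the beta pay-as-you-go arithmetic.
PRICE_PER_PAGE_PER_LANGUAGE = 0.10  # USD, current beta rate

def estimate_cost(pages: int, languages: int, tm_reuse: float = 0.0) -> float:
    """Estimate translation cost in USD.

    tm_reuse is the fraction of content served from translation memory:
    0.0 for a first run, roughly 0.6-0.8 for typical updates.
    """
    return round(pages * languages * PRICE_PER_PAGE_PER_LANGUAGE * (1 - tm_reuse), 2)

print(estimate_cost(100, 3))                # 30.0
print(estimate_cost(200, 5))                # 100.0
print(estimate_cost(100, 3, tm_reuse=0.7))  # 9.0 -- update with 70% TM reuse
```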

See full details: Beta Program

Note: Beta pricing may change at general availability. Early adopters get locked-in pricing.


How long does translation take?

Typical translation times:

| Documentation Size | Languages | Time |
| --- | --- | --- |
| 50 pages | 3 | 10-15 minutes |
| 100 pages | 3 | 20-30 minutes |
| 200 pages | 3 | 40-60 minutes |
| 100 pages | 10 | 60-90 minutes |

Updates with translation memory: 2-5 minutes for typical changes (only retranslates changed content)

Factors affecting speed:

  • Number of pages
  • Content complexity
  • Number of target languages
  • LLM provider rate limits

Can I customize translations?

Yes! Several ways to customize:

1. Translation Memory (Recommended)

# Provide preferred translations for specific terms
from transaurus import GlobalTMManager

tm = GlobalTMManager()
tm.add_translation(
    source_text="webhook",
    target_language="es",
    translation="webhook",  # Keep as-is, don't translate
)

2. Source Content Improvements

  • Use consistent terminology in English
  • Provide context for abbreviations
  • Add glossary sections

3. Post-Translation Editing

  • Edit translated files directly in GitHub
  • Translation memory remembers your edits
  • Future updates preserve your customizations

Enterprise plans (coming Q2 2025) include:

  • Custom terminology glossaries
  • Domain-specific tuning
  • Human review workflows
  • Translation approval processes

What languages are supported?

100+ languages powered by Claude AI.

Most popular languages:

  • Spanish (es), French (fr), German (de)
  • Japanese (ja), Korean (ko)
  • Chinese Simplified (zh-Hans), Chinese Traditional (zh-Hant)
  • Portuguese (pt), Russian (ru), Arabic (ar)
  • Italian (it), Dutch (nl), Polish (pl)

Quality by language:

  • Major languages (es, fr, de, ja, zh, ko): 91-95/100
  • European languages (it, pt, nl, pl, sv, da): 88-92/100
  • Asian languages (th, vi, id): 85-90/100
  • Other languages: 80-88/100

See Configuration → Supported Languages for the complete list.


Is my source code safe?

Yes. PageTurner takes security seriously:

Data handling:

  • ✅ Code is processed via secure APIs (GitHub, Anthropic)
  • ✅ No code stored on PageTurner servers
  • ✅ LLM providers (Anthropic) have enterprise security
  • ✅ All connections use HTTPS/TLS encryption

Access control:

  • ✅ You control GitHub permissions (read-only access is sufficient)
  • ✅ Translations created in repositories you own
  • ✅ No third-party access to your code

Open source:

  • ✅ Translation engine is open source (review code yourself)
  • ✅ No proprietary black boxes

Note: LLM providers (like Anthropic) process your content for translation but don't use it for model training (per their enterprise agreements).


Technical Questions

What gets created when I run a translation?

GitHub repositories:

  • Mirror repositories (one per target language)
    • Example: docs-es, docs-fr, docs-zh-Hans
  • Full copy of source repository with translations
  • GitHub Actions workflow for auto-updates
  • Security settings configured

Files created/modified:

  • Translated MDX/Markdown files (in docs/, blog/, etc.)
  • Translated JSON configuration files
  • Updated docusaurus.config.js with i18n settings
  • Updated sidebars with translated labels
  • Translation memory files (.tm/ directory)

Vercel deployment (if token provided):

  • Vercel project linked to GitHub repo
  • Automatic build and deployment
  • Production URLs (e.g., https://docs-es.vercel.app)
  • Automatic deployments on future commits

Database records:

  • Repository configuration in Supabase
  • Translation memory entries
  • Status tracking and metrics

How do I handle updates to my documentation?

Simple: Just re-run the same command.

# Initial translation
result = master.run_full_translation_pipeline()

# ... time passes, you update your docs ...

# Update translations (only changed content gets retranslated)
result = master.run_full_translation_pipeline()

What happens:

  1. Change detection: SHA256 hashing identifies changed content
  2. Selective translation: Only changed segments get retranslated
  3. Cost savings: 60-80% reduction (only pay for what changed)
  4. Auto-deployment: Changes automatically deployed to Vercel

Example:

  • First run: 100 pages → 300 translation requests
  • Update 5 pages: 5 pages → 15 translation requests
  • Savings: 95% cost reduction
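
The change-detection step can be sketched with nothing but the standard library. This is an illustrative reimplementation of the idea, not PageTurner's actual code; the segment IDs and dictionaries are made up:

```python
import hashlib

def content_hash(text: str) -> str:
    """SHA256 fingerprint of a content segment."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def changed_segments(segments, previous_hashes):
    """Return IDs of segments whose hash differs from the last run.

    Only these need retranslation; everything else is served from
    translation memory at no additional cost.
    """
    return [
        seg_id for seg_id, text in segments.items()
        if previous_hashes.get(seg_id) != content_hash(text)
    ]

# First run: hash everything.
docs = {"intro.md": "Welcome to the docs.", "api.md": "API reference."}
hashes = {seg_id: content_hash(text) for seg_id, text in docs.items()}

# Later: one page edited, so only it shows up as changed.
docs["intro.md"] = "Welcome to the updated docs."
print(changed_segments(docs, hashes))  # ['intro.md']
```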

Automation (optional): Set up GitHub Actions to auto-translate on every commit to main branch.


Will my React components break?

No. PageTurner was built specifically for Docusaurus and preserves all JSX/MDX components.

What gets preserved:

  • ✅ React component tags (<Tabs>, <TabItem>, custom components)
  • ✅ JSX attributes and props
  • ✅ Import statements
  • ✅ Code blocks (never translated)
  • ✅ Inline code expressions
  • ✅ Component nesting and hierarchy

What gets translated:

  • ✅ Text content inside components
  • ✅ String attributes (like label="Click here")
  • ✅ YAML frontmatter
  • ✅ Markdown text

Example:

Before:

<Tabs>
<TabItem value="js" label="JavaScript">
Install the package:
(code block: npm install watermelondb)
</TabItem>
</Tabs>

After (Spanish):

<Tabs>
<TabItem value="js" label="JavaScript">
Instala el paquete:
(code block: npm install watermelondb)
</TabItem>
</Tabs>
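
One common way to guarantee this kind of preservation is to mask component tags with opaque placeholders before text reaches the LLM, then restore them afterwards. A minimal sketch of that approach, under the assumption that component tags start with a capital letter; this is illustrative only, and PageTurner's actual mechanism may differ:

```python
import re

# Matches capitalized component tags like <Tabs> or </TabItem>,
# but not lowercase HTML tags.
JSX_TAG = re.compile(r"</?[A-Z][A-Za-z0-9]*[^>]*>")

def mask_jsx(mdx):
    """Replace JSX component tags with placeholders the LLM won't touch."""
    placeholders = {}
    def _mask(match):
        key = f"__JSX_{len(placeholders)}__"
        placeholders[key] = match.group(0)
        return key
    return JSX_TAG.sub(_mask, mdx), placeholders

def unmask_jsx(translated, placeholders):
    """Restore the original component tags after translation."""
    for key, tag in placeholders.items():
        translated = translated.replace(key, tag)
    return translated

src = '<TabItem value="js" label="JavaScript">Install the package:</TabItem>'
masked, tags = mask_jsx(src)
print(masked)  # __JSX_0__Install the package:__JSX_1__
print(unmask_jsx(masked, tags) == src)  # True
```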

If components DO break: This is a bug! Contact support immediately with your repository URL.


Can I translate only specific pages or sections?

Yes! Use file allow/disallow patterns.

master = DocusaurusMaster(
    source_repo_url="https://github.com/your-org/docs",
    target_languages=["es", "fr"],

    # Only translate docs/ folder
    file_allow_patterns=[r"docs/.*\.mdx?$"],

    # Exclude internal documentation
    file_disallow_patterns=[
        r"docs/internal/.*",
        r"docs/draft/.*",
    ],
)

Use cases:

  • Translate only public documentation
  • Skip draft/WIP pages
  • Exclude internal notes
  • Translate blog separately from docs

What happens if translation fails mid-process?

PageTurner is resilient to interruptions:

Automatic recovery:

  • βœ… Translation memory saves progress continuously
  • βœ… Completed translations are preserved
  • βœ… Re-running continues from last successful state
  • βœ… Failed segments automatically retried

What gets saved:

  • Completed file translations
  • Translation memory entries
  • Repository state
  • Deployment status

To recover: Simply re-run the same command - it will skip completed work and resume.
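
The resume behavior can be illustrated with a small loop: anything already in translation memory is skipped, so a re-run after a crash only pays for the remaining segments. A hypothetical sketch, not the real pipeline:

```python
def translate_all(segments, memory, translate):
    """Resume-friendly translation loop.

    `memory` maps source text to its translation and acts as a checkpoint:
    each result is recorded immediately, so a crash loses at most the
    segment in flight, and a re-run skips everything already done.
    """
    results = {}
    for seg_id, text in segments.items():
        if text not in memory:  # not yet translated (or the last run died here)
            memory[text] = translate(text)
        results[seg_id] = memory[text]
    return results

# Simulate recovery: "Hello" was finished before an interruption.
memory = {"Hello": "Hola"}
out = translate_all(
    {"a.md": "Hello", "b.md": "World"},
    memory,
    translate=lambda s: f"<{s} in Spanish>",  # stand-in for the real LLM call
)
print(out)  # {'a.md': 'Hola', 'b.md': '<World in Spanish>'}
```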


Troubleshooting

"GitHub repository creation failed"

Possible causes:

1. Insufficient permissions

Error: Resource not accessible by integration

Fix: Ensure GitHub PAT has repo scope:

  1. Go to GitHub Settings → Tokens
  2. Click your token
  3. Check repo is selected
  4. Regenerate if needed

2. Target organization doesn't exist

Error: Organization 'your-org-i18n' not found

Fix: Either create the organization or omit github_target_org parameter:

# Use default (creates in same org as source)
master = DocusaurusMaster(
    source_repo_url="https://github.com/your-org/docs",
    # Don't specify github_target_org
)

3. Repository name already exists

Error: Repository 'docs-es' already exists

Fix: Either delete the existing repo or use a different target org.


"LLM translation error" or "Anthropic API error"​

Possible causes:

1. Invalid API key

Error: Invalid API key

Fix: Verify your Anthropic API key:

  1. Visit Anthropic Console
  2. Check key is active
  3. Copy key exactly (starts with sk-ant-)

2. Insufficient credits

Error: Insufficient credits

Fix: Add credits to Anthropic account:

  1. Go to Billing in Anthropic Console
  2. Add $20+ (covers ~50,000 pages)

3. Rate limit exceeded

Error: 429 Too Many Requests

Fix: Rate limits vary by account tier:

  • Free tier: 50 requests/minute
  • Paid tier: 1000 requests/minute

PageTurner automatically retries with backoff. If the error persists, reduce concurrency or upgrade your Anthropic tier.
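
The retry-with-backoff behavior looks roughly like this (an illustrative sketch; the exception class and delays are made up, not PageTurner internals):

```python
import random
import time

class RateLimitError(Exception):
    """Stands in for an HTTP 429 response from the LLM provider."""

def with_backoff(call, max_retries=5, base_delay=1.0):
    """Retry `call` on rate-limit errors with exponential backoff plus jitter."""
    for attempt in range(max_retries):
        try:
            return call()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # give up after the final attempt
            # 1s, 2s, 4s, ... plus a little randomness to avoid thundering herds
            time.sleep(base_delay * 2 ** attempt + random.uniform(0, base_delay))

# The first two calls hit the rate limit; the third succeeds.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RateLimitError("429 Too Many Requests")
    return "translated"

print(with_backoff(flaky, base_delay=0.01))  # translated
```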


"Vercel deployment shows 404" or "Build failed"​

Possible causes:

1. Missing vercel.json

Error: Build failed - No build command specified

Fix: Ensure source repository has proper Docusaurus configuration. PageTurner copies build configuration from source repo.

2. Build errors in translated content

Check Vercel build logs:

  1. Go to Vercel dashboard
  2. Select deployment
  3. View build logs

Common issues:

  • Broken Markdown syntax (usually preserved correctly)
  • Missing imports (check if source builds)
  • Configuration errors

Fix: If PageTurner introduced errors, contact support with build logs.

3. Wrong branch deployed

Fix: Check Vercel project settings:

  1. Go to Vercel project settings
  2. Git → Production Branch
  3. Should be main or master

"Translation quality is poor" or "Doesn't sound natural"​

Causes and fixes:

1. Source content quality

  • Problem: Source docs use inconsistent terminology or unclear writing
  • Fix: Improve source documentation first
    • Use consistent terms in English
    • Define abbreviations on first use
    • Write clearly and concisely

2. Missing context

  • Problem: Technical terms without explanation
  • Fix: Add brief explanations or glossary in source docs

3. Wrong terminology

  • Problem: Technical term translated incorrectly
  • Fix: Use translation memory to provide preferred translation:

from transaurus import GlobalTMManager

tm = GlobalTMManager()
tm.add_translation(
    source_text="webhook",
    target_language="es",
    translation="webhook",  # Keep as-is
)

4. Cultural mismatch

  • Problem: Some expressions don't translate well culturally
  • Fix: Rephrase source content or post-edit translation

Expected quality:

  • Technical documentation: 91-95/100 (excellent)
  • Marketing content: 85-90/100 (good, may need human review)
  • Creative content: 80-85/100 (acceptable, likely needs editing)

"Translation is taking too long"​

Normal durations:

  • 100 pages, 3 languages: 20-30 minutes
  • 200 pages, 5 languages: 60-90 minutes

If significantly slower:

1. Check LLM provider rate limits

  • Free tier: 50 req/min → slower processing
  • Paid tier: 1000 req/min → faster

2. Large content segments

  • Very long pages take longer to translate
  • Consider breaking up mega-pages

3. Network issues

  • Check internet connection
  • Verify API endpoints accessible

4. High concurrency

  • PageTurner processes up to 100 files in parallel
  • May occasionally hit rate limits and back off

To monitor progress: Check console output for current translation status.


"Cost is higher than expected"​

Understanding costs:

Token usage depends on:

  • Content length (longer pages = more tokens)
  • Number of languages
  • Translation memory reuse (first run vs. updates)

To estimate costs BEFORE running:

# Use DRY_RUN mode
result = master.run_full_translation_pipeline(run_mode="DRY_RUN")
print(f"Estimated cost: ${result['estimated_cost']}")

To reduce costs:

  1. Use translation memory (60-80% savings on updates)
  2. Translate incrementally (start with high-priority pages)
  3. Exclude unnecessary files (use file_allow_patterns)
  4. Share TM across projects (reuse translations)

Still Need Help?

Get Support

Documentation:

Contact us:

Community:

Response times:

  • Beta users: 24-48 hours
  • Open source projects: 2-5 business days
  • Enterprise: 4-hour SLA (coming Q2 2025)

Didn't Find Your Question?

We're continuously updating this FAQ based on real user questions.

Submit your question:

  1. Contact us with your question
  2. We'll answer personally
  3. We'll add it to this FAQ for others

Help us improve: If you found this FAQ helpful (or not), let us know!