Server-Side · Self-Hosted · CAPI-Native
One CAPI layer for Meta, TikTok, Google, and whatever's next.
Server-side conversions you can see. Transform logic you own. Customer data that never leaves your region. One source, every ad network, full per-event observability.
- Real-Time Delivery History
- JavaScript Transform Sandbox
- Self-Hosted by Default
Hosted in your region · GDPR-aligned by architecture · Self-hosted option available
function transform({ events, secrets }) {
  return events
    .filter(e => e.event_type === 'purchase')
    .map(e => ({
      event_type: 'purchase',
      event_time: e.event_time,
      event_payload: {
        event_name: 'Purchase',
        event_time: Math.floor(
          new Date(e.event_time).getTime() / 1000
        ),
        user_data: {
          em: e.event_payload.hashed_email,
          ph: e.event_payload.hashed_phone,
          fbp: e.event_payload.fbp,
          fbc: e.event_payload.fbc,
        },
        custom_data: {
          currency: e.event_payload.currency,
          value: e.event_payload.revenue,
        },
        event_id: e.event_payload.order_id,
        action_source: 'website',
      },
    }));
}

CAPI is supposed to fix attribution. For most teams, it just moves the problem.
Black-box delivery
You toggle it on and hope. Failures show as ROAS drops, not errors.
Vendor-locked transforms
Platforms change schemas. Vendor templates lag. Your events are wrong until someone else ships.
Data residency as an afterthought
Hashed PII flows through someone else's cloud. SOC 2 doesn't tell you which region.
Each new platform multiplies the problem
Meta works. Add TikTok. Then Google. Each one brings its own tokens, its own observability, its own breakage.
Meiro Pipes is built for exactly this.
Pipes is a self-hosted Customer Data Infrastructure that runs in your VPC, your region. For server-side conversion routing, it gives you three composable building blocks, with full per-event observability between them.
Webhook /collect endpoint. Captures events from your site, app, or backend.
JavaScript sandbox. You write the function. 47 vetted npm modules.
send() function with bound secrets. Real-time delivery records.
Same model for every platform you add.
Stop hoping. See every event you send.
Every event entering a Pipe creates a delivery record. Inspect it in the UI or query it directly. You see the payload that left your instance, the destination's response, the timestamp, and the exact error if delivery failed. No 24-hour delay, no paywall, no support ticket.
If Meta returns a 400 because a payload was malformed, you see it immediately. You know which event caused it.
# Inspect deliveries
mpcli api GET /api/pipes/<pipeId>/deliveries
mpcli api GET /api/event-deliveries/<deliveryId>

{
  "id": "evd_01J...",
  "pipe_id": "pip_01J...",
  "event_id": "ord_70213",
  "destination": "meta-capi-prod",
  "status": "delivered",
  "request": {
    "url": "https://graph.facebook.com/v19.0/123456789/events",
    "payload": {
      "data": [
        {
          "event_name": "Purchase",
          "event_id": "ord_70213"
        }
      ]
    }
  },
  "response": {
    "status": 200,
    "body": {
      "events_received": 1,
      "fbtrace_id": "..."
    }
  },
  "delivered_at": "2026-05-04T10:14:22.481Z"
}

You own the transform. Always.
Each Pipe has a JavaScript transform in a secure sandbox. You decide the shape of every event before it hits the destination. When Meta ships an API change on Tuesday, you ship Tuesday.
function transform({ events, profiles, secrets }) {
  return events
    .filter(e => e.event_type === 'purchase')
    .map(e => ({
      event_type: 'purchase',
      event_time: e.event_time,
      event_payload: {
        event_name: 'Purchase',
        event_time: Math.floor(new Date(e.event_time).getTime() / 1000),
        user_data: {
          em: e.event_payload.hashed_email,
          ph: e.event_payload.hashed_phone,
          fbp: e.event_payload.fbp,
          fbc: e.event_payload.fbc,
          client_ip_address: e.event_payload.ip,
          client_user_agent: e.event_payload.ua,
        },
        custom_data: {
          currency: e.event_payload.currency,
          value: e.event_payload.revenue,
        },
        event_id: e.event_payload.order_id,
        action_source: 'website',
      },
    }));
}

// Verify against TikTok Events API v1.3 before going to production.
function transform({ events, profiles, secrets }) {
  return events
    .filter(e => e.event_type === 'purchase')
    .map(e => ({
      event_type: 'purchase',
      event_time: e.event_time,
      event_payload: {
        event: 'CompletePayment',
        event_time: Math.floor(new Date(e.event_time).getTime() / 1000),
        event_id: e.event_payload.order_id,
        user: {
          email: e.event_payload.hashed_email,
          phone_number: e.event_payload.hashed_phone,
          ttp: e.event_payload.ttp,
          ttclid: e.event_payload.ttclid,
          ip: e.event_payload.ip,
          user_agent: e.event_payload.ua,
        },
        properties: {
          currency: e.event_payload.currency,
          value: e.event_payload.revenue,
          order_id: e.event_payload.order_id,
        },
        page: {
          url: e.event_payload.page_url,
        },
      },
    }));
}

// Verify against Google Ads API uploadClickConversions before going to production.
function transform({ events, profiles, secrets }) {
  return events
    .filter(e => e.event_type === 'purchase')
    .map(e => ({
      event_type: 'purchase',
      event_time: e.event_time,
      event_payload: {
        conversion_action: secrets.GOOGLE_ADS_CONVERSION_ACTION,
        // Google Ads expects "yyyy-mm-dd hh:mm:ss+hh:mm" -- strip the
        // fractional seconds toISOString() produces.
        conversion_date_time: new Date(e.event_time)
          .toISOString()
          .replace('T', ' ')
          .replace(/\.\d{3}Z$/, '+00:00'),
        conversion_value: e.event_payload.revenue,
        currency_code: e.event_payload.currency,
        order_id: e.event_payload.order_id,
        gclid: e.event_payload.gclid,
        user_identifiers: [
          { hashed_email: e.event_payload.hashed_email },
          { hashed_phone_number: e.event_payload.hashed_phone },
        ],
      },
    }));
}

47 npm modules available inside the sandbox: crypto, jose, date-fns, zod, lodash-es, and more. Hash, validate, reshape, and enrich without leaving the function.
Tokens never touch your code.
The Pipe transform shapes the event. The Event Destination delivers it. Both pull from the same Secrets store: named, isolated, never logged. Rotate a token by updating the secret. The destination's send function never changes.
Sandbox blocks access to private IPs, internal services, and localhost. Transforms can't exfiltrate secrets.
async function send({ events, secrets }) {
  const res = await fetch(
    `https://graph.facebook.com/v19.0/${secrets.META_PIXEL_ID}/events`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        data: events.map(e => e.event_payload),
        access_token: secrets.META_ACCESS_TOKEN,
      }),
    }
  );
  return res.json();
}

# Store the token. Never paste it into code.
printf '%s' "$META_TOKEN" | mpcli secrets create \
--key META_ACCESS_TOKEN \
--value-stdin \
--description "Meta CAPI pixel token"
# Rotate without touching the send function
printf '%s' "$NEW_TOKEN" | mpcli secrets update-value \
  <secretId> --value-stdin

Add a platform, not a project.
One Source
Webhook /collect endpoint. Same identity resolution, same event types.
N Destinations
Each platform is one send function and one secret. Add or remove without touching the source.
One observability surface
Every delivery, every platform, in one place.
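As a sketch of what "one send function and one secret" means in practice, here is a hypothetical second destination for TikTok. The endpoint path, header name, and secret names are assumptions; verify against TikTok's Events API documentation before using them.

```javascript
// Hypothetical TikTok Event Destination: one send() function, one bound secret.
// Endpoint, header, and secret names are assumptions -- check the TikTok
// Events API v1.3 docs before shipping.
function buildRequest({ events, secrets }) {
  return {
    url: 'https://business-api.tiktok.com/open_api/v1.3/event/track/',
    headers: {
      'Content-Type': 'application/json',
      'Access-Token': secrets.TIKTOK_ACCESS_TOKEN, // pulled from the Secrets store
    },
    body: {
      event_source: 'web',
      event_source_id: secrets.TIKTOK_PIXEL_CODE,
      // Events arrive already shaped by the Pipe transform.
      data: events.map(e => e.event_payload),
    },
  };
}

async function send({ events, secrets }) {
  const { url, headers, body } = buildRequest({ events, secrets });
  const res = await fetch(url, {
    method: 'POST',
    headers,
    body: JSON.stringify(body),
  });
  return res.json();
}
```

Adding the platform is then binding one new secret and enabling the destination; the source and transform model stay exactly as they were for Meta.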
Your data stays where you deploy it.
Meiro Pipes runs in your AWS VPC, your Azure region, or your on-premise data center. Customer events, including hashed PII, never traverse a vendor cloud. No Meiro layer sits in the data path.
Self-hosted by default
Deploy in any region. EU-only, US-only, multi-region. The vendor never sees your events.
Secrets isolated at runtime
Tokens live in named Secrets, scoped to the destination, accessible only inside the sandbox. Never logged, never visible in delivery history.
Audit trail on every entity
Every Pipe, destination, and transform is versioned. Revert in one command if a change broke production.
mpcli api GET /api/pipes/<pipeId>/versions
mpcli api POST /api/pipes/<pipeId>/revert --json '{"version": 3}'

If your DPA says customer data stays in the EU, Meiro Pipes can guarantee it structurally.
Deploying Pipes takes engineering. Running it doesn't.
Pipes is self-hosted, so the deploy is real engineering work. Once it's running, the operational model is small and scriptable.
- Create a Webhook Source. Generates a /collect/<slug> endpoint your site or backend calls.
- Define event types. Declare purchase, add_to_cart, lead, with the schema each one expects.
- Create a Secret. Store the platform access token. Never paste it into code.
- Create an Event Destination. Write the send() function for Meta, TikTok, or Google.
- Create a Pipe. Connect the Source to the Destination. Add a transform if needed.
- Test end-to-end. Send a test event. Inspect the delivery record. Verify the platform received it.
# Create destination
mpcli api POST /api/event-destinations --file ./meta-capi-destination.json
# Bind secrets
mpcli api PUT /api/event-destinations/<id>/secrets --json '{"secretIds":["<secretId>"]}'
# Create pipe
mpcli api POST /api/pipes --file ./purchase-to-meta.json
# Enable
mpcli api PUT /api/pipes/<id>/toggle
mpcli api PUT /api/event-destinations/<id>/toggle
# Verify
mpcli api GET /api/pipes/<id>/deliveries

Config transfer: staging → production
Validated in staging? Export the configuration, import it into production. No re-creation, no copy-paste errors.
Failure alerts out of the box
Wire Slack or Teams webhooks once. The first time a Pipe fails in production, you find out immediately.
What you get
| Capability | Meiro Pipes | Typical SaaS CAPI tool |
|---|---|---|
| Delivery visibility | Real-time, per-event delivery history | Delayed or paywalled |
| Transform logic | Full JavaScript sandbox, 47 npm modules | Vendor templates |
| Schema updates | Edit the function, deploy immediately | Wait on vendor |
| Multi-platform | One platform, one operator model | Separate integrations |
| Data residency | Your infrastructure, your region | Vendor cloud |
| PII handling | Never leaves your environment | Routed through SaaS |
| Audit trail | Full version history on every entity | Limited |
| Secret rotation | mpcli secrets update-value, zero downtime | Manual reconfiguration |
| Failure alerts | Built-in Slack / Teams | Optional add-on |
| Deploy automation | Full mpcli CLI + REST API | UI-only |
"Typical SaaS CAPI tool" represents the common feature set across managed server-side conversion routing platforms. Specific vendor capabilities vary.
Frequently asked questions
How long does it actually take to deploy Pipes?
Infrastructure deploy (Kubernetes, database, workers) is half a day to a day, with Helm charts and a guided setup. Your first CAPI integration takes two to four hours after that. Operations are low-overhead. Most changes are API calls or function edits.
Where does customer data physically reside?
Wherever you deploy the Pipes instance. Deploy in eu-west-1, processing happens in eu-west-1. No relay, no egress. Meiro's control plane syncs licensing and config only. It never touches event data.
How is event deduplication handled?
Pipes passes the event_id straight through. Meta, TikTok, and Google all dedupe server-side when the ID matches the browser-side pixel event. Your transform maps the right field, typically an order ID or client UUID.
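A minimal sketch of that mapping, assuming the browser pixel was given the order ID as its eventID and the payload looks like the examples on this page:

```javascript
// Dedup works when the server event's event_id equals the browser pixel's
// eventID for the same conversion. Sketch: reuse the order ID on both sides.
function withDedupIds(events) {
  return events.map(e => ({
    ...e,
    event_payload: {
      ...e.event_payload,
      event_id: e.event_payload.order_id, // same value the pixel sent
    },
  }));
}
```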
What happens when a platform changes its API?
You edit the transform and, if needed, the send function. Both are plain JavaScript, both deploy on save. No vendor release cycle. You respond to API changes on your own timeline.
How do I rotate access tokens?
Pipe the new token into mpcli secrets update-value <secretId> --value-stdin. The next event delivery picks up the new value. No code change, no downtime.
Can I sample or route events conditionally?
Yes. The transform receives the full event array. Filter, sample, or branch on any field. Fan out to multiple destinations by connecting multiple Pipes to the same source.
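As a sketch, one transform can filter and sample in a few lines. The threshold, sample rate, and field names here are illustrative, not a prescribed setup:

```javascript
// Illustrative transform: keep high-value purchases, sample 10% of page views,
// drop everything else. Threshold and field names are assumptions.
function transform({ events }) {
  return events.filter(e => {
    if (e.event_type === 'purchase') {
      return e.event_payload.revenue >= 50; // branch on any field
    }
    if (e.event_type === 'page_view') {
      return Math.random() < 0.1; // 10% sample
    }
    return false;
  });
}
```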
Is there a hosted option if we don't want to self-host?
Yes. Meiro operates the infrastructure in a region you specify. Events stay in that region. You keep full access to the transform and destination functions. Email [email protected] to discuss.
How does this relate to the marketing-positioned /server-side-capi/ page?
The server-side CAPI page covers the business case: match-rate recovery and ROAS impact. This page covers the engineering: observability, ownership, data residency. Same product, two angles.
Run your CAPI stack on infrastructure you control.
We provision a Pipes instance in your region and walk through your first server-side delivery on the call.
Or email [email protected]