Connect v2 provides new features and simplifies some common APIs. In addition, it makes use of all the enhancements of Protobuf-ES v2. If you're currently using Connect v1, this document walks you through all you need to know to migrate and start using it right away.
Important
- Node 16 is no longer supported. Connect v2 now supports Node versions 18.14.1 and up.
- TypeScript 4.1 is no longer supported. Connect v2 now requires at least TypeScript v4.9.5.
To help with the process of migrating, we provide a tool called @connectrpc/connect-migrate which will take care of dependency and plugin updates as well as a few minor code changes. As a first step, execute the following command:
npx @connectrpc/connect-migrate@latest
While the tool will do a lot of the dependency legwork for you, there are many use cases in code that it does not cover. Please read on to understand what has changed and how you can leverage new features in your application.
Tip
If you use generated SDKs, read the section Upgrading generated SDKs before you run connect-migrate.
One important dependency change to be aware of is that the plugin protoc-gen-connect-es has been removed in v2. Connect now relies on service descriptors generated by the Protobuf-ES v2 plugin protoc-gen-es and no longer generates code itself. Therefore, that dependency must be removed entirely from package.json.
Your mileage may vary according to what @bufbuild and @connectrpc packages you depend on, but a list of relevant, compatible dependencies should look similar to the following:
"dependencies": {
"@bufbuild/protobuf": "^2.2.0",
"@bufbuild/protoc-gen-es": "^2.2.0",
"@connectrpc/connect": "^2.0.0",
- "@connectrpc/protoc-gen-connect-es": "^1.0.0",
"@connectrpc/connect-web": "^2.0.0",
"@connectrpc/connect-node": "^2.0.0",
"@connectrpc/connect-next": "^2.0.0",
"@connectrpc/connect-fastify": "^2.0.0",
"@connectrpc/connect-express": "^2.0.0",
"@connectrpc/connect-query": "^2.0.0",
"@connectrpc/protoc-gen-connect-query": "^2.0.0",
"@connectrpc/connect-playwright": "^0.6.0"
}
✅ The connect-migrate tool will handle this.
Remove any usage of the protoc-gen-connect-es plugin from buf.gen.yaml.
If you are using local plugins:
# buf.gen.yaml
version: v2
plugins:
- local: protoc-gen-es
out: src/gen
include_imports: true
opt: target=ts
- - local: protoc-gen-connect-es
- out: src/gen
- opt: target=ts
If you are using remote plugins:
# buf.gen.yaml
version: v2
plugins:
- - remote: buf.build/bufbuild/es:v1.10.0
+ - remote: buf.build/bufbuild/es:v2.2.0
out: src/gen
include_imports: true
opt: target=ts
- - remote: buf.build/connectrpc/es
- out: src/gen
- opt: target=ts
✅ The connect-migrate tool will handle this.
The *_connect.ts files generated by the old plugin are no longer needed. We recommend using the clean option provided by the Buf CLI, introduced in v1.36.0:
# buf.gen.yaml
version: v2
+ clean: true
plugins:
- local: protoc-gen-es
out: src/gen
include_imports: true
With this option, buf generate will delete the contents of src/gen before generating code.
The plugin option import_extension has changed behavior: If you are using Node16 module resolution and need the .js extension on all import paths, add the plugin option import_extension=js:
# buf.gen.yaml
version: v2
plugins:
- local: protoc-gen-es
out: src/gen
include_imports: true
opt:
- target=ts
+ - import_extension=js
If you don't want the .js extension added to import paths, you can remove the plugin option import_extension=none - it's the default behavior now:
# buf.gen.yaml
version: v2
plugins:
- local: protoc-gen-es
out: src/gen
include_imports: true
opt:
- target=ts
- - import_extension=none
If you have been using the plugin option ts_nocheck=false, you can remove it as well - it's the default behavior now.
Now that dependencies and buf.gen.yaml are updated, the next step is to re-generate code. The migration tool does not handle code generation, so be sure to do so in whatever way your project is configured, for example npx buf generate or npm run generate.
Note
Ensure that your buf.gen.yaml includes the following option to generate code for imports:
include_imports: true
See the Gotchas section for an explanation.
Now that dependencies are updated and new code is generated, let's go through the changes to your application code.
Once your code is generated and the vestigial *_connect.ts files are removed, import paths will need to be updated. This is usually an update from *_connect to *_pb:
- import { ElizaService } from "./gen/eliza_connect";
+ import { ElizaService } from "./gen/eliza_pb";
✅ The connect-migrate tool will handle this.
In many applications, it's likely that you'll encounter the most significant change in version 2.0 of Protobuf-ES: we no longer generate classes for Protobuf messages. Instead, we generate a schema object and an associated TypeScript type definition for each message.
Instead of the new keyword, you create a message with a function call:
- import { SayRequest } from "./gen/eliza_connect.js";
+ import { SayRequestSchema } from "./gen/eliza_pb.js";
+ import { create } from "@bufbuild/protobuf";
- const sayRequest = new SayRequest({
+ const sayRequest = create(SayRequestSchema, {
sentence: "Hello",
});
Messages are now plain TypeScript types, which greatly improves compatibility with the ecosystem. For example, messages can be passed from a server-side component in Next.js to a client-side component without losing any data or types.
Because messages no longer have attached class methods, a standalone function is provided as a replacement. Here is an example for toBinary:
import { toBinary } from "@bufbuild/protobuf";
import { SayRequestSchema } from "./gen/eliza_pb";
- sayRequest.toBinary();
+ toBinary(SayRequestSchema, sayRequest);
The same applies to the methods equals, clone, toJson, and toJsonString, and to the static methods fromBinary, fromJson, and fromJsonString.
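For example, a round trip through the standalone functions looks like this (a minimal sketch, reusing the generated SayRequestSchema):
import { create, toBinary, fromBinary, toJsonString } from "@bufbuild/protobuf";
import { SayRequestSchema } from "./gen/eliza_pb";
// Serialize to binary, parse the bytes back, then emit Protobuf JSON.
const bytes = toBinary(SayRequestSchema, create(SayRequestSchema, { sentence: "Hello" }));
const sayRequest = fromBinary(SayRequestSchema, bytes);
const json = toJsonString(SayRequestSchema, sayRequest);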
Warning
Note that messages no longer implement the magic toJSON method, which serialized a message with the Protobuf JSON format when it was passed to JSON.stringify. Make sure to always serialize to JSON with the toJson or toJsonString function.
Messages no longer include a reference to their type, but they have a property for their qualified Protobuf name:
- sayRequest.getType().typeName; // "connectrpc.eliza.v1.SayRequest"
+ sayRequest.$typeName; // "connectrpc.eliza.v1.SayRequest"
To identify an unknown message, the function isMessage is still supported:
- import { SayRequest } from "./gen/eliza_connect.js";
+ import { SayRequestSchema } from "./gen/eliza_pb.js";
import { isMessage } from "@bufbuild/protobuf";
- if (isMessage(x, SayRequest)) {
+ if (isMessage(x, SayRequestSchema)) {
x.sentence;
}
The PlainMessage<T> type was used to represent just the fields of a message, without class methods. Now that messages are plain types, the type is no longer necessary and has been removed, along with the function toPlainMessage. In most cases, you can simply remove usage of PlainMessage:
- import type { PlainMessage } from "@bufbuild/protobuf";
import type { SayRequest } from "./gen/eliza_pb";
- const sayRequest: PlainMessage<SayRequest> = {
+ const sayRequest: SayRequest = {
$typeName: "connectrpc.eliza.v1.SayRequest",
sentence: "Hello",
};
The $typeName property is required to identify messages. If you have a use case where it's distracting, use built-in types to remove it:
const sayRequest: Omit<SayRequest, "$typeName"> = {
sentence: "Hello",
};
Tip
proto3 messages are plain objects, but with proto2 or Editions, messages may use the prototype chain to track field presence. See the section Field presence and default values in the Protobuf-ES documentation. If you need messages to be as simple as possible in all cases, see the section about JSON types.
Similar to PlainMessage, PartialMessage has been removed. It was used for partial initializer objects when creating new messages. For this use case, the replacement is MessageInitShape. It retrieves the type from a descriptor for forwards compatibility.
For other use cases, built-in types are preferred, now that messages are plain types.
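For example, a function that used to accept a PartialMessage<SayRequest> can be typed with MessageInitShape instead - a minimal sketch, reusing the generated SayRequestSchema:
import { create, type MessageInitShape } from "@bufbuild/protobuf";
import { SayRequestSchema } from "./gen/eliza_pb";
// Accepts a partial initializer for SayRequest, much like the old PartialMessage<SayRequest>.
function newSayRequest(init: MessageInitShape<typeof SayRequestSchema>) {
  return create(SayRequestSchema, init);
}
newSayRequest({ sentence: "Hello" });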
proto2 fields with default values are no longer generated as optional properties:
/**
* @generated from field: required int32 num = 3;
*/
- num?: number;
+ num: number;
Tip
In general, this makes working with proto2 messages much more convenient, because you no longer have to handle undefined when accessing the property. You can rely on the default value 0 instead. If you need to distinguish between an absent value for a field and the default value, use the function isFieldSet from @bufbuild/protobuf. You can learn more in the section Field presence and default values in the Protobuf-ES documentation.
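A minimal sketch of isFieldSet, assuming a hypothetical ExampleSchema generated for a proto2 message with the num field shown above:
import { create, isFieldSet } from "@bufbuild/protobuf";
import { ExampleSchema } from "./gen/example_pb"; // hypothetical generated schema
const msg = create(ExampleSchema);
const numField = ExampleSchema.fields.find((f) => f.localName === "num");
if (numField && !isFieldSet(msg, numField)) {
  // "num" was not explicitly set; msg.num holds its default value
}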
The well-known type google.protobuf.Struct is now generated as a more convenient JsonObject when used as a message field:
/**
* @generated from field: google.protobuf.Struct struct = 1;
*/
- struct?: Struct;
+ struct?: JsonObject;
Tip
This feature makes it very easy to work with Struct fields:
myMessage.struct = {
text: "abc",
number: 123,
};
All well-known types have been moved to the subpath export @bufbuild/protobuf/wkt. For example, if you want to refer to google.protobuf.Timestamp:
- import type { Timestamp } from "@bufbuild/protobuf";
+ import type { Timestamp } from "@bufbuild/protobuf/wkt";
Helpers that were previously part of the generated class are now standalone functions, also exported from @bufbuild/protobuf/wkt:
- import type { Timestamp } from "@bufbuild/protobuf";
+ import type { Timestamp } from "@bufbuild/protobuf/wkt";
+ import { timestampDate } from "@bufbuild/protobuf/wkt";
const timestamp: Timestamp = ...
- const date: Date = timestamp.toDate();
+ const date: Date = timestampDate(timestamp);
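The wkt export also provides helpers for the opposite direction, for example timestampFromDate and timestampNow:
import { timestampFromDate, timestampNow } from "@bufbuild/protobuf/wkt";
// Create a Timestamp from a JavaScript Date, or for the current time.
const createdAt = timestampFromDate(new Date("2024-01-01"));
const now = timestampNow();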
Reflection has received a major update in Protobuf-ES v2, and is much more capable and flexible now. Instead of providing minimal field information from generated code, full Protobuf descriptors are available.
Here is an example for finding the JSON name of a field:
- SayRequest.fields.list()
+ SayRequestSchema.fields
.find((f) => f.localName === "sentence")
?.jsonName;
All common use cases continue to be supported, but types and properties have changed to consolidate the API and provide new features:
| Old type | Replacement |
| --- | --- |
| MessageType | DescMessage |
| EnumType | DescEnum |
| FieldInfo | DescField |
| OneofInfo | DescOneof |
| ServiceType | DescService - also see Changes to service descriptors |
| MethodInfo | DescMethod |
Tip
For more information, see the Protobuf reflection documentation.
Registries are commonly used for serializing the well-known type google.protobuf.Any. To create a registry, you simply pass the schema objects instead of message classes:
import { createRegistry } from "@bufbuild/protobuf";
- import { SayRequest, SayResponse } from "./gen/eliza_pb";
+ import { SayRequestSchema, SayResponseSchema } from "./gen/eliza_pb";
const registry = createRegistry(
- SayRequest, SayResponse,
+ SayRequestSchema, SayResponseSchema,
);
Tip
In the new version, registries also support files, and can be composed or mutated. See the documentation on Protobuf registries to learn more.
Promise clients are now the default, and the previously deprecated function createPromiseClient has been removed. Update any call sites using createPromiseClient to use createClient.
- import { createPromiseClient } from "@connectrpc/connect";
+ import { createClient } from "@connectrpc/connect";
- createPromiseClient(ElizaService, transport);
+ createClient(ElizaService, transport);
✅ The connect-migrate tool will handle this.
The gRPC transport now requires HTTP/2. If you are using createGrpcTransport and specifying an httpVersion, it will fail compilation. Remove the httpVersion property to use the default of HTTP/2.
import { createGrpcTransport } from "@connectrpc/connect-node";
createGrpcTransport({
baseUrl: "https://demo.connectrpc.com",
- httpVersion: "2",
});
Note that if you were relying on HTTP/1.1 as part of your gRPC strategy, this may require bigger architectural changes, but the hope is that this is not a common problem.
We have removed the credentials option from transports, as well as the init option in interceptors. These two options were used to customize fetch routines. To set the fetch option credentials, provide a fetch override:
createConnectTransport({
baseUrl: "/",
- credentials: "include",
+ fetch: (input, init) => fetch(input, { ...init, credentials: "include" }),
});
JSON serialization options (passed to a transport or a server plugin) have been updated in Protobuf-ES v2. The option typeRegistry has been renamed to registry:
import { createRegistry } from "@bufbuild/protobuf";
import { createConnectTransport } from "@connectrpc/connect-web";
- import { SayRequest } from "./gen/eliza_pb";
+ import { SayRequestSchema } from "./gen/eliza_pb";
const transport = createConnectTransport({
baseUrl: "https://demo.connectrpc.com",
jsonOptions: {
- typeRegistry: createRegistry(SayRequest),
+ registry: createRegistry(SayRequestSchema),
},
});
Tip
Registries have received a major update in Protobuf-ES v2 and are much more capable and flexible now. For more information, see the Protobuf registry documentation.
JSON serialization options (passed to a transport or a server plugin) have been updated in Protobuf-ES v2. The option emitDefaultValues has been renamed to alwaysEmitImplicit:
await server.register(
fastifyConnectPlugin,
{
routes,
jsonOptions: {
- emitDefaultValues: true,
+ alwaysEmitImplicit: true,
},
},
);
Tip
When this option is enabled, proto3 default values such as 0, "", or false are serialized to JSON, but proto2 default values are not.
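The renamed option also applies when calling the standalone JSON functions directly - a minimal sketch, reusing the generated SayRequestSchema:
import { create, toJsonString } from "@bufbuild/protobuf";
import { SayRequestSchema } from "./gen/eliza_pb";
const empty = create(SayRequestSchema);
toJsonString(SayRequestSchema, empty); // "{}"
toJsonString(SayRequestSchema, empty, { alwaysEmitImplicit: true }); // '{"sentence":""}'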
Connect now relies on service descriptors generated by the Protobuf-ES v2 plugin protoc-gen-es and no longer generates code itself. The type for service descriptors changes from ServiceType to DescService from @bufbuild/protobuf.
The descriptors still provide the same functionality - typed metadata for clients and servers - but in a slightly different form. Types and properties have changed to consolidate the reflection APIs and to provide new features.
Tip
Service and method descriptors provide access to custom options now, which can be very useful in interceptors to control authorization and other details. See the documentation for Protobuf custom options to learn more.
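As an illustration, a server-side interceptor could read a hypothetical custom method option (here requiresAdmin, generated by protoc-gen-es from your own options proto) from the method descriptor. This is a sketch, assuming the getOption helper from @bufbuild/protobuf:
import { getOption } from "@bufbuild/protobuf";
import type { Interceptor } from "@connectrpc/connect";
import { requiresAdmin } from "./gen/options_pb"; // hypothetical custom option extension
const authz: Interceptor = (next) => async (req) => {
  // req.method is a descriptor, so custom options declared on the RPC are available here.
  if (getOption(req.method, requiresAdmin)) {
    // verify credentials before forwarding the request
  }
  return await next(req);
};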
The methods property is renamed to method:
- const say = ElizaService.methods.say;
+ const say = ElizaService.method.say;
Tip
methods provides an array of the methods.
To distinguish between unary and streaming RPCs, use the methodKind property:
- import { MethodKind } from "@bufbuild/protobuf";
- say.kind; // MethodKind.Unary
+ say.methodKind; // "unary"
The enum MethodIdempotency has been replaced by the well-known type MethodOptions_IdempotencyLevel:
- import { MethodIdempotency } from "@bufbuild/protobuf";
+ import { MethodOptions_IdempotencyLevel } from "@bufbuild/protobuf/wkt";
- say.idempotency; // MethodIdempotency.NoSideEffects
+ say.idempotency; // MethodOptions_IdempotencyLevel.NO_SIDE_EFFECTS
Interceptors for streaming RPCs now use appropriate stream types. In v1, the server used UnaryRequest and StreamResponse for server-streaming RPCs, while the client always used the streaming variants. This was unintended behavior and has been fixed in v2. Now all streaming RPCs use the StreamRequest and StreamResponse types on the server as well.
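In an interceptor, the stream discriminator tells the two shapes apart - a minimal sketch:
import type { Interceptor } from "@connectrpc/connect";
const logStreams: Interceptor = (next) => async (req) => {
  if (req.stream) {
    // Streaming RPCs now arrive as StreamRequest on clients and servers alike.
    console.log("streaming call:", req.method.name);
  }
  return await next(req);
};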
The init property with fetch options has been removed. As a replacement for determining whether an incoming request is a Connect GET request in server-side interceptors, the property requestMethod: string has been added to intercepted requests. This property is symmetrical to HandlerContext.requestMethod.
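For example, a server-side interceptor can check it like this (a sketch):
import type { Interceptor } from "@connectrpc/connect";
const detectGet: Interceptor = (next) => async (req) => {
  if (req.requestMethod === "GET") {
    // the incoming request is a Connect GET request
  }
  return await next(req);
};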
Because method descriptors are now self-sufficient, it is no longer necessary to pass the service descriptor to ConnectRouter.rpc, and the argument has been removed from the method signature. Update your call sites as follows:
const routes = ({rpc}: ConnectRouter) => {
- rpc(ElizaService, ElizaService.say, impl);
+ rpc(ElizaService.say, impl);
}
Note
The same change applies to the Transport interface. This is only relevant if you access Transports without a client, or if you have implemented your own custom transport.
If you raise a Connect error on a server, you can include arbitrary Protobuf messages as error details. The syntax to provide them has changed slightly: In Connect v1, error details were specified as message instances. In v2, error details are now an object that specifies both a message schema and initialization object. For example:
- import { LocalizedMessage } from "./gen/google/rpc/error_details_pb";
- const details = [
- new LocalizedMessage({
- locale: "fr-CH",
- message: "Je n'ai plus de mots.",
- }),
- ];
+ import { LocalizedMessageSchema } from "./gen/google/rpc/error_details_pb";
+ const details = [
+ {
+ desc: LocalizedMessageSchema,
+ value: {
+ locale: "fr-CH",
+ message: "Je n'ai plus de mots.",
+ }
+ },
+ ];
const metadata = new Headers({
"words-left": "none"
});
throw new ConnectError(
"I have no words anymore.",
Code.ResourceExhausted,
metadata,
details,
);
The connect-migrate tool does not upgrade generated SDKs at this point in time. We recommend that you use the following steps to upgrade:
- Uninstall old generated SDKs
- Run the migration tool
- Install new generated SDKs
- Update your application code
Since the Connect plugin no longer exists in v2, any generated SDK dependencies in your package.json that rely on this plugin (i.e. have connectrpc_es as part of their name) must be updated to use the Protobuf-ES v2 plugin instead.
The name of a generated SDK dependency is structured as follows:
@buf/{module_owner}_{module_name}.{plugin_owner}_{plugin_name}
For example, if you are using a generated SDK for the BSR module buf.build/googleapis/googleapis, you have a dependency on @buf/googleapis_googleapis.connectrpc_es in your package.json file.
Run the following command to remove the dependency:
npm remove @buf/googleapis_googleapis.connectrpc_es
To do this for other modules, simply replace googleapis/googleapis with your module owner and name.
npx @connectrpc/connect-migrate@latest
This command will update your dependencies on @connectrpc and @bufbuild packages so that you can install the new generated SDK.
Now you need to replace the old generated SDK with a dependency on @buf/googleapis_googleapis.bufbuild_es (same BSR module, but using the Protobuf-ES v2 plugin):
npm install @buf/googleapis_googleapis.bufbuild_es@latest
Your package.json should now resemble the following:
"dependencies": {
...
- "@buf/googleapis_googleapis.connectrpc_es": "^1.6.1-20241107203341-553fd4b4b3a6.1",
+ "@buf/googleapis_googleapis.bufbuild_es": "^2.2.2-20241107203341-553fd4b4b3a6.1",
"@connectrpc/connect-web": "^2.0.0",
"@bufbuild/protobuf": "^2.2.0",
...
}
Note
Your versions will differ per module.
Now your dependencies are updated, and you can follow the guide to Update your application code.
Make sure to update your imports to the new package name and file names:
- import { ByteStream } from "@buf/googleapis_googleapis.connectrpc_es/google/bytestream_connect.js";
+ import { ByteStream } from "@buf/googleapis_googleapis.bufbuild_es/google/bytestream_pb.js";
Because Protobuf-ES supports custom options and other reflection-based features now, generated code includes more information than in the previous version, and will generate additional imports in some situations.
For example, if you have a Protobuf message that uses validation rules from buf.build/bufbuild/protovalidate, the Protobuf file has an import for the validation options:
import "buf/validate/validate.proto";
The old plugin ignored this import, but the new plugin will generate a corresponding ECMAScript import:
+ import { file_buf_validate_validate } from "./buf/validate/validate_pb";
The imported file is not generated by default. To include imports, add the following option to your buf.gen.yaml config:
# buf.gen.yaml
version: v2
plugins:
- local: protoc-gen-es
out: src/gen
+ include_imports: true
Connect-ES and Protobuf-ES use package exports. If you see the following error with Parcel, make sure to enable package exports:
@parcel/core: Failed to resolve '@bufbuild/protobuf/codegenv1'
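One way to enable package exports in Parcel is through its default resolver configuration in package.json - a sketch:
"@parcel/resolver-default": {
  "packageExports": true
}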
Connect-ES and Protobuf-ES use package exports. If you see the following error with Metro or Expo, make sure to enable package exports:
Metro error: Unable to resolve module @bufbuild/protobuf/codegenv1
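One way to enable package exports with Metro is through the resolver option in metro.config.js - a sketch for a bare React Native project (Expo projects use expo/metro-config instead):
// metro.config.js
const { getDefaultConfig, mergeConfig } = require("@react-native/metro-config");
module.exports = mergeConfig(getDefaultConfig(__dirname), {
  resolver: {
    // Resolve subpath exports such as "@bufbuild/protobuf/codegenv1".
    unstable_enablePackageExports: true,
  },
});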
Previously, Connect allowed request objects with matching shapes to be passed to API calls interchangeably as long as the passed object was a superset of the target type. For example, given the following proto definitions:
syntax = "proto3";
package example.v1;
import "google/protobuf/empty.proto";
message MessageA {
string field_a = 1;
}
message MessageB {
string field_a = 1;
int64 field_b = 2;
}
service ExampleService {
rpc RequestA(MessageA) returns (google.protobuf.Empty) {}
rpc RequestB(MessageB) returns (google.protobuf.Empty) {}
}
The following would have passed TypeScript compilation:
client.requestA(new MessageA());
client.requestA(new MessageB());
This was an unintended bug and not a feature. In Connect v2, only the specified target type will pass compilation.
client.requestA(create(MessageASchema));
client.requestA(create(MessageBSchema)); // Type Error: Argument of type MessageB is not assignable to parameter of type MessageInitShape<typeof MessageASchema>
If you intend to pass a message as a different message with the same fields, you can use object destructuring to drop the $typeName and copy the rest of the properties:
const messageA: MessageA = ...;
const { $typeName: _, ...properties } = messageA;
const messageB = create(MessageBSchema, properties);