-
Our project is at about 40 endpoints using tRPC v9 with React, and we're experiencing a pretty substantial TypeScript slowdown. Mutations in particular are excruciatingly slow at this point. We're using Zod schemas for both input and output, if that matters. Are there any guidelines or resources for potentially restructuring our project in order to improve TS performance? I know v10 should significantly improve performance, but we need to do a lot of work with v9 before it is released.

Edit: I've been trying a few different things to speed up performance. I'm not having any luck speeding up IntelliSense, but I was able to start TypeScript in watch mode, which type-checks much faster (about 3 seconds instead of roughly 15). It has pretty much all the same benefits as the IntelliSense version other than autocomplete (autocomplete isn't performing badly); I'll just have to get used to looking at the console. 3 seconds is fine by me for now; it's still light years faster than having to open a Swagger doc or something. There are some compiler options that are pretty much required to have decent compile times. I'll keep looking into ways to potentially improve performance. Hopefully at some point we can have some sort of documentation on how to deal with this sort of thing in the context of tRPC.
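For reference, the watch-mode approach described above can be wired up as an npm script (a sketch; the script name is an assumption, not from the original post):

```json
{
  "scripts": {
    "typecheck:watch": "tsc --noEmit --watch"
  }
}
```

`--noEmit` keeps it a pure type check, and `--watch` reuses the previous program between runs, which is where most of the speedup over a cold `tsc` comes from.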
Replies: 8 comments 14 replies
-
What you've tried sounds like the best way to improve performance for v9. You could also try the v10 alpha, but you'd be susceptible to any changes. Another thing you can try is project references, but that's much more complicated.
-
So, just an update on this: our number of endpoints continued to grow, and eventually TypeScript just died. We got a "Type instantiation is excessively deep" error past a certain number of endpoints, which makes it unusable. However, I came up with a hack that fixes TypeScript in a way that scales to any number of endpoints, though it is slightly more complex. The idea is to split the type into multiple routers, then create a custom hook per router that uses the react-query hooks typed to just that one router. So you end up with a bunch of different hooks:

- `useUserQuery`
- `useCommentsQuery`
- `useToDoQuery`

Each of those hooks only has the type for one router. Not ideal, but it does fix the TypeScript issues we're facing, and we still get all the benefits of using tRPC. I'm not too upset with this solution; at least I can have peace of mind that our TypeScript will scale now (as long as our routers are sized reasonably). Plus it'll fix the slowdowns. I'm going to implement this fully on Monday when I'm back on the project, and I'll post the solution here in case anyone else needs to fix this in v9.
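A minimal, framework-free sketch of the split-router idea (all names here are illustrative, not from the actual codebase): each caller is parameterized by one sub-router's definition, so the checker never has to instantiate the merged app-router type.

```ts
// Illustrative hand-rolled sub-router definitions (not inferred by tRPC).
type UserRouter = {
  getLoggedInUser: { input: void; output: { id: string; name: string } };
};
type CommentsRouter = {
  listComments: { input: { postId: string }; output: string[] };
};

// Factory that binds a generic, untyped transport to ONE router definition.
// TypeScript only ever works with the small per-router type here.
function makeCaller<R extends Record<string, { input: unknown; output: unknown }>>(
  prefix: string,
  transport: (path: string, input: unknown) => unknown,
) {
  return <P extends keyof R & string>(path: P, input: R[P]['input']): R[P]['output'] =>
    transport(prefix + path, input) as R[P]['output'];
}

// Fake transport standing in for the real HTTP client.
const fakeTransport = (path: string, _input: unknown): unknown => {
  if (path === 'user.getLoggedInUser') return { id: '1', name: 'Ada' };
  if (path === 'comments.listComments') return ['first!'];
  throw new Error('unknown path: ' + path);
};

const callUser = makeCaller<UserRouter>('user.', fakeTransport);
const callComments = makeCaller<CommentsRouter>('comments.', fakeTransport);
```

The same shape works for hooks: wrap one `makeCaller`-style binding per router inside a `useUserQuery`-style hook, and the per-call types stay small.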
-
Alright, so one last follow-up just in case anyone sees this thread in the future. The above solution ended up breaking down on us as well. The autocomplete/IntelliSense became so slow that it was pretty much impossible to write code (since auto imports rely on IntelliSense, waiting on it is blocking).

So, we had to get our auto imports back. The first thing we did was use `any` as the router type. Of course this turns off all type safety, which sucks a lot, but we had to write code. To maintain some semblance of type safety we imported types directly from the backend. This fixed the performance issues and let us write our code, but also left a lot of room for developer error.

Eventually we came up with a solution for explicitly typing everything that gives us that sweet, sweet full-stack type safety. Basically, we now manually maintain our own definitions of routers, which look like this, for example:

```ts
export type UserRouterDefinition = {
  queries: {
    getLoggedInUser: {
      input: EmptyInput
      output: UserOutput
    }
    deleteLoggedInUser: {
      input: EmptyInput
      output: MessageOutput
    }
  }
  mutations: {
    updateLoggedInUser: {
      input: UpdateLoggedInUserInput
      output: UserOutput
    }
    updateLoggedInUserPassword: {
      input: UpdateUserPasswordInput
      output: MessageOutput
    }
  }
}
```

So this is extra overhead; it's worse than getting it for free like you do when tRPC is functioning properly, but we had to do it because our IntelliSense died completely, which blocked all front-end development. Each router requires its own custom hooks (although we probably could have had just one big router):

```ts
import { createTrpcQueryHook } from "src/trpc/query-hooks/create-trpc-query-hook";
import { UseMutationOptions, UseMutationResult, UseQueryOptions, UseQueryResult } from "react-query";
import { UserRouterDefinition } from "src/trpc/routers/User.router.definition";
import { useTrpcContext } from "src/trpc";

const [_useQuery, _useMutation] = createTrpcQueryHook('prefix.');

export function useUserQuery<Path extends keyof UserRouterDefinition['queries']>(pathAndInput: [Path, UserRouterDefinition['queries'][Path]['input']], options?: UseQueryOptions) {
  return _useQuery(pathAndInput, options) as UseQueryResult<UserRouterDefinition['queries'][Path]['output']>;
}

export function useUserMutation<Path extends keyof UserRouterDefinition['mutations']>(args: [Path] | Path, options?: UseMutationOptions) {
  return _useMutation(args, options as any) as UseMutationResult<UserRouterDefinition['mutations'][Path]['output'], unknown, UserRouterDefinition['mutations'][Path]['input']>;
}

export function useUserInvalidateQuery() {
  const context = useTrpcContext();
  return <Path extends keyof UserRouterDefinition['queries']>(pathAndInput: [Path, UserRouterDefinition['queries'][Path]['input']]) => {
    const path = 'user.' + pathAndInput[0];
    context.invalidateQueries([path, pathAndInput[1]]);
  };
}

export function useUserSetQueryData() {
  const context = useTrpcContext();
  return <Path extends keyof UserRouterDefinition['queries']>(
    pathAndInput: [Path, UserRouterDefinition['queries'][Path]['input']],
    updater: (old: UserRouterDefinition['queries'][Path]['output'] | undefined) => (UserRouterDefinition['queries'][Path]['output'] | undefined)
  ) => {
    // @ts-ignore
    context.setQueryData(pathAndInput, updater);
  };
}
```

This might look like a lot to maintain at first glance, but we were actually able to create a script that auto-generates all these custom hooks per router. The end result is that we now maintain these so-called "router definition" files (so, one additional property in the definition per endpoint) and run a command any time we make changes, to regenerate the hooks.

Just for additional context:
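Since the post mentions auto-generating these hooks per router but doesn't show the script, here's a hypothetical sketch of what such a generator could look like (the file layout and naming scheme are assumptions):

```ts
// Hypothetical code generator: given a router name, emit the source of its
// hooks file as a string. A real script would loop over the routers and
// write each result to disk with fs.writeFileSync.
function generateHooksFile(routerName: string): string {
  const def = `${routerName}RouterDefinition`;
  return [
    `import { ${def} } from "src/trpc/routers/${routerName}.router.definition";`,
    ``,
    `export function use${routerName}Query<Path extends keyof ${def}['queries']>(`,
    `  pathAndInput: [Path, ${def}['queries'][Path]['input']],`,
    `) {`,
    `  /* ...same body as the hand-written hook above... */`,
    `}`,
  ].join('\n');
}
```

String templating keeps the generator trivial; since the output is plain TypeScript, the compiler still checks the generated hooks on the next build.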
Additionally, I just want to point out how serious an issue slow IntelliSense performance is. If your IntelliSense is loading, you are blocked. If your IntelliSense slows down significantly, you are really, really screwed; it's a complete disaster. I didn't realize before this experience how necessary a tool IntelliSense is, but losing it really is a worst-case scenario as far as developer productivity goes. Auto imports in particular are absolutely necessary when building anything new. IntelliSense slowdowns have to be avoided at all costs.

Anyways, I'm not saying anyone should or shouldn't use tRPC v9, but it's worth documenting that performance issues can become severe enough that type inference has to be disabled. I'm still glad I used it, because it pushed our team towards full-stack type safety and now we have a highly maintainable codebase (plus it forced me to get better at TypeScript =D). There was no previous documentation on how to "opt out" of inference as an escape hatch for these issues, so we had to create our own, and it's really not much extra effort in the grand scheme of things versus vanilla tRPC with high TypeScript performance.
-
I am also noticing a significant slowdown using tRPC with only a few endpoints (using Nx, too). I have a fully beefed-out MacBook Pro, and it's still excruciatingly slow. I can't imagine how this is supposed to "work" for anyone; I imagine that everyone regularly runs into this problem. Is there any advice?
-
Having the same slowness as well, on a quite small project: 15 routes with small-to-medium queries/mutations (1–5 each).
-
We were able to improve our situation significantly. As with a lot of you, DX had become abysmal, with really slow autocomplete.

The following steps helped:

1. Removal of "context bloat". We had added a few heavy objects to the context in the definition of our procedures (namely, the Supabase client). This caused extremely large type definitions for every single route. We identified this problem with the next step.

2. Type generation. Initially, we wanted to use xtrp. However, we had problems that were hard to investigate, so we went with a basic `tsc`-based setup instead. By analyzing the generated files we initially identified the context bloat. This comes with the obvious drawback of having to generate the types regularly; with a proper setup, though, that's not an issue for us.

3.
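To illustrate the "context bloat" fix in step 1 above — a hedged sketch with a stand-in client type, since the actual setup isn't shown: annotating the context with an explicit named interface keeps the checker from inlining a huge structural type (such as a Supabase client's) into every procedure.

```ts
// Stand-in for a heavy third-party client type (e.g. a Supabase client),
// whose inferred structural type can be enormous.
interface HeavyClient {
  query(table: string): Promise<unknown[]>;
}

// Explicit, named context interface: procedures reference `Context` by name
// instead of re-expanding the client's full structural type everywhere.
export interface Context {
  db: HeavyClient;
  userId: string | null;
}

// Annotating the return type (rather than letting it be inferred) is the
// key move; inference here would bake the heavy type into every route.
export function createContext(db: HeavyClient): Context {
  return { db, userId: null };
}
```

The same principle applies to procedure outputs: an explicit named type is cheap for the checker to pass around, while a deeply inferred one is recomputed and re-displayed at every use site.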
-
Just upgrade your RAM to a minimum of 32 GB and everything will be a breeze.
On Fri, Apr 5, 2024 at 4:16 PM, timlgl wrote:

@lithdew As with presumably most codebases, we have all routers in a subdirectory. There we have a separate file `tsconfig.trpc.json` with `"outDir": "./types"`. In our package.json we then just added a "bake" script:

```json
"trpc:bake": "pnpm run trpc:clean && pnpm run trpc:generate",
"trpc:generate": "npx -p typescript tsc --project ./routers/tsconfig.trpc.json",
"trpc:clean": "rm -rf ./routers/types"
```

Dependencies of the routers will also be generated.
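The `tsconfig.trpc.json` mentioned above could look something like this (a sketch under assumptions — the actual file isn't shown in the thread; only `outDir` is confirmed):

```json
{
  "extends": "../tsconfig.json",
  "compilerOptions": {
    "declaration": true,
    "emitDeclarationOnly": true,
    "outDir": "./types"
  },
  "include": ["./**/*.ts"]
}
```

`emitDeclarationOnly` produces just the `.d.ts` files the client needs, so the client can import pre-baked types instead of re-inferring them from the router source.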
-
Could be because of Prisma or Drizzle; there are techniques to optimize those ORMs.

On Fri, Apr 5, 2024 at 8:12 PM, Kenta Iwasaki wrote:

I've got 64 GB and it takes 7 seconds for autocomplete to come up with 32 routes 😅.