MyCellar.ai is an AI-powered wine selection assistant that helps you discover the perfect wine from your cellar based on your collection and preferences.
Features · Model Providers · Deploy Your Own · Running locally
## Features

- Next.js App Router
  - Advanced routing for seamless navigation and performance
  - React Server Components (RSCs) and Server Actions for server-side rendering and increased performance
- AI SDK
  - Unified API for generating text, structured objects, and tool calls with LLMs
  - Hooks for building dynamic wine selection and generative user interfaces
  - Supports xAI (default), OpenAI, Fireworks, and other model providers
- shadcn/ui
  - Styling with Tailwind CSS
  - Component primitives from Radix UI for accessibility and flexibility
- Data Persistence
  - Neon Serverless Postgres for saving wine selection history and user data
  - Vercel Blob for efficient file storage
- Auth.js
  - Simple and secure authentication
## Model Providers

MyCellar.ai ships with xAI `grok-2-1212` as the default AI model for wine recommendations. However, with the AI SDK, you can switch LLM providers to OpenAI, Anthropic, Cohere, and many more with just a few lines of code.
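For example, switching from the default xAI model to an OpenAI one is a one-line change to the model passed into the AI SDK. The sketch below is illustrative, not this template's actual source (the prompt and file location are assumptions); it requires the `ai`, `@ai-sdk/xai`, and `@ai-sdk/openai` packages plus the matching API keys:

```typescript
// Hedged sketch of swapping model providers with the AI SDK.
import { xai } from '@ai-sdk/xai';
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

// Default provider: xAI's grok-2-1212.
const { text } = await generateText({
  model: xai('grok-2-1212'),
  prompt: 'Suggest a wine from my cellar to pair with roast duck.',
});

// Switching providers is a one-line change to the `model` field:
//   model: openai('gpt-4o'),
```

Each provider package reads its API key from the corresponding environment variable (e.g. `XAI_API_KEY`, `OPENAI_API_KEY`), so no other code changes are needed.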
## Deploy Your Own

You can deploy MyCellar.ai to Vercel by connecting your repository and setting up the required environment variables.
## Running locally

You will need to use the environment variables defined in `.env.example` to run MyCellar.ai locally. It's recommended you use Vercel Environment Variables for this, but a `.env` file is all that is necessary.

> Note: You should not commit your `.env` file or it will expose secrets that will allow others to control access to your various AI and authentication provider accounts.
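A `.env` file for this kind of setup might look like the sketch below. The variable names here are assumptions chosen to match the stack described above (xAI, Neon Postgres, Vercel Blob, Auth.js); the authoritative list is whatever `.env.example` contains:

```shell
# Hypothetical .env sketch -- copy .env.example and fill in real values.
# Variable names are illustrative, not authoritative.
AUTH_SECRET=generate-a-random-secret        # e.g. `openssl rand -base64 32`
XAI_API_KEY=your-xai-api-key                # default model provider
POSTGRES_URL=postgres://user:pass@host/db   # Neon Serverless Postgres
BLOB_READ_WRITE_TOKEN=your-blob-token       # Vercel Blob file storage
```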
1. Install Vercel CLI: `npm i -g vercel`
2. Link your local instance with your Vercel and GitHub accounts (creates a `.vercel` directory): `vercel link`
3. Download your environment variables: `vercel env pull`
Then install dependencies and start the development server:

```bash
pnpm install
pnpm dev
```

Your app template should now be running on [localhost:3000](http://localhost:3000).