Slava Ukraine

About Me - Technical Achievements

The Story

As a child, I was always fascinated by computers. At the age of 20, I decided to pursue a career as a professional software developer. I began taking C++ courses and passed the Plovdiv University (PU) entrance exam with a score of 5.40. However, shortly after that, I moved to Sofia and started working at a company called Freido, where I initially worked with PHP.

Later, I joined StangaOne, where we actively used the Symfony Framework, along with Doctrine & Propel ORM, for backend development. Of course, we also worked with HTML, CSS, and JavaScript. In addition, we integrated various payment methods and implemented numerous other features.


During my 20 years of experience, I have worked for various companies both as a full-time employee and a part-time contractor. Throughout my career, I have developed my skills and gained a deeper understanding of the industry.

I have collaborated with various Bulgarian companies, as well as international ones.



StangaOne

At StangaOne I worked on various complex applications in fields such as banking and e-commerce. We integrated:

  • different payment methods and APIs
  • unique and attractive UIs
  • complex server-side & frontend logic
Later on, they contacted me again, and we collaborated on a project for Unzer, a large German payments corporation. The application was relatively large, with many internal services and a lot of legacy code.
It was built around Event-Driven Architecture (EDA), using Kafka and RabbitMQ.
As a senior developer, I was part of a team responsible for developing a split-payment service. For example, when you're at checkout and want to pay for your products, the system allows you to pay part of the amount in cash and the rest with a debit card. In the end, the total is calculated, and the bill is settled.
Some of the technologies we used included Vue 2/3, Fastify, Kafka, PostgreSQL, TypeORM, and others.
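As a simplified illustration of the split-payment idea (the types and amounts below are mine, not the actual Unzer service code), the core of the feature is allocating one total across several payment instruments and checking that the bill settles exactly:

```typescript
// Simplified sketch of split-payment settlement. Amounts are in cents to
// avoid floating-point errors. Illustrative only, not the real service logic.

type PaymentPart = { method: "cash" | "debit_card" | "voucher"; amountCents: number };

// Returns the remaining balance after applying each payment part;
// throws if the parts overpay the total.
function settle(totalCents: number, parts: PaymentPart[]): number {
  const paid = parts.reduce((sum, p) => sum + p.amountCents, 0);
  if (paid > totalCents) {
    throw new Error(`Overpayment: paid ${paid}, total ${totalCents}`);
  }
  return totalCents - paid; // 0 means the bill is fully settled
}

// Example: pay part of the amount in cash and the rest with a debit card.
const remaining = settle(10_000, [
  { method: "cash", amountCents: 4_000 },
  { method: "debit_card", amountCents: 6_000 },
]);
// remaining === 0 → fully settled
```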

Diversity.com

One of the most interesting projects I have worked on is Diversity.com — a job posting portal for minorities in the United States.

After presenting several ideas and developing the first version of the website, the client was highly impressed with my work and promoted me to the role of Technical Manager. They handled the sales aspect, while I was responsible for the technical side of the business, including website development, tools, team, and online marketing.

To support the project's growth, we needed to expand the team by hiring additional programmers, designers, and marketers. Once we assembled the team, the business took off significantly, generating a monthly profit of approximately $100,000.

A large part of this success was due to my marketing strategies! One of the biggest achievements was ranking the website as the #1 result on Google for the keyword "diversity", a remarkable milestone considering how extremely competitive the U.S. market is for top-ranking keywords on Google. This happened around 2017.

Responsibilities:

  • Developing complex applications; integrating third-party APIs and payment methods
  • Managing marketing strategy
  • Managing a team of 4 specialists
  • Communicating with stakeholders; planning and goal-achievement strategy
  • High-level system design and schemas

Interesting case:

While we were developing this application, there were many additional projects and technical challenges that I had to solve.

One particular case study involved a database with several million records and daily traffic of 50,000 – 70,000 visits on the homepage. The challenge was to implement a slider that retrieves results from the database via AJAX and displays them on the homepage, all while ensuring a load time of under 1 second (based on SEO requirements).

Simply optimizing the database query was not enough; a non-standard solution was required!

I came up with an effective approach: I created a cron job that selects 250 job posts from millions of records. This cron job runs every six hours and stores these 250 job posts in a JSON file. The slider then randomly selects 60 records from this pre-generated JSON file, ensuring fast page load times.
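A minimal sketch of that caching approach (the original was likely PHP; this is a TypeScript/Node rendering, and all names are illustrative):

```typescript
import * as fs from "node:fs";
import * as os from "node:os";
import * as path from "node:path";

// Illustrative sketch of the JSON-cache approach, not the original code.
type JobPost = { id: number; title: string };

// Step 1 (cron job, every 6 hours): select 250 posts and persist them as JSON.
function refreshCache(posts: JobPost[], cacheFile: string): void {
  const selection = posts.slice(0, 250); // in production this was a curated DB query
  fs.writeFileSync(cacheFile, JSON.stringify(selection));
}

// Step 2 (homepage AJAX handler): sample 60 random posts from the cached file,
// never touching the database on the hot path.
function sliderPosts(cacheFile: string, count = 60): JobPost[] {
  const cached: JobPost[] = JSON.parse(fs.readFileSync(cacheFile, "utf8"));
  const shuffled = [...cached].sort(() => Math.random() - 0.5); // quick shuffle, fine for display
  return shuffled.slice(0, count);
}

// Demo with synthetic data:
const cachePath = path.join(os.tmpdir(), "slider-cache.json");
const allPosts: JobPost[] = Array.from({ length: 1000 }, (_, i) => ({ id: i, title: `Job ${i}` }));
refreshCache(allPosts, cachePath);
const slide = sliderPosts(cachePath);
// slide.length === 60, served entirely from the JSON cache
```

The key property: homepage requests read a small pre-generated file instead of querying millions of rows, keeping load time well under the SEO budget.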


I had another case where members of minority communities in Bulgaria asked me to run Facebook ads for them (Facebook was constantly blocking their ads), which required working around Facebook's restrictions. There were many, many other projects and cases besides.


After Diversity.com, I started actively working with Node.js (Express), React (Next.js), TypeScript, and mainly JavaScript technologies, which remain my main tech stack at the moment.


Another interesting project I worked on was for InfiniGods.com, a Web3 gaming platform where users can play, earn NFTs, and trade them. (Web3 refers to the next generation of the internet, emphasizing decentralization, blockchain technology, and user ownership of data; an NFT, or non-fungible token, is a unique digital asset stored on a blockchain representing ownership of something distinct, such as artwork, music, or an in-game item.)

The project was originally outsourced to an external agency, but their implementation was poorly done: it used a hybrid setup with PHP for the backend and an outdated version of React (v16) for the frontend. As expected, this caused multiple issues, especially since PHP didn't support Web3 features natively, while JavaScript had already matured with tools and libraries like ethers.js.

As a senior developer, my responsibility was to take over the entire project from the external team, set up and manage a new Linux-based VPS server, and rebuild the application from scratch using Next.js (with TypeScript) and Express for the backend. This new architecture allowed us to implement essential business features properly, including MetaMask wallet integration (MetaMask is a cryptocurrency wallet and browser extension that lets users store crypto, interact with dApps, and securely sign blockchain transactions), Ethereum smart contract communication, and other blockchain-related functionality.

It was also the first time I learned how to interact with smart contracts (self-executing programs stored on a blockchain that automatically carry out actions when certain conditions are met, without intermediaries like banks or centralized servers) written in other languages, such as Solidity, the language designed for contracts running on the Ethereum Virtual Machine (EVM), directly from JavaScript code, which was a powerful and exciting experience. I gained solid hands-on knowledge of Web3, especially around wallet integration, blockchain interaction, and decentralized application (dApp) infrastructure.
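As a hedged sketch of that pattern (the contract address is a zero-address placeholder, the ABI is a minimal ERC-721 fragment, and `readOnchainData`/`weiToEth` are my illustrative names, not the project's code), calling on-chain data from TypeScript via ethers.js (v6) looks roughly like this:

```typescript
// Pure helper: convert wei (bigint) to a decimal ETH string,
// e.g. 1500000000000000000n -> "1.5".
function weiToEth(wei: bigint): string {
  const WEI_PER_ETH = 10n ** 18n;
  const whole = wei / WEI_PER_ETH;
  const frac = (wei % WEI_PER_ETH).toString().padStart(18, "0").replace(/0+$/, "");
  return frac ? `${whole}.${frac}` : `${whole}`;
}

// Wiring sketch; requires the "ethers" package and a MetaMask-enabled browser.
// The import is dynamic so this file loads even where ethers isn't installed.
async function readOnchainData(owner: string): Promise<{ eth: string; nfts: string }> {
  const modName = "ethers";
  const { ethers } = await import(modName);
  const provider = new ethers.BrowserProvider((globalThis as any).ethereum); // MetaMask injects `ethereum`
  const wei: bigint = await provider.getBalance(owner);                      // wallet balance in wei
  const abi = ["function balanceOf(address owner) view returns (uint256)"];  // minimal ERC-721 fragment
  const nft = new ethers.Contract("0x0000000000000000000000000000000000000000", abi, provider);
  const count: bigint = await nft.balanceOf(owner);                          // NFTs held by the wallet
  return { eth: weiToEth(wei), nfts: count.toString() };
}
```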

This project greatly expanded my expertise in both modern full-stack development and the web3 ecosystem, and I’m truly grateful for the opportunity to work on such cutting-edge technology.
Check out my simple personal Web3 demo project.


We Craft Media
I worked with We Craft Media on a project where we had to build a React & TypeScript interface for managing electric vehicles. The backend was built with Express. As a senior developer, I was responsible for the frontend, and one of the ideas I proposed at the time was to use Redux Toolkit (RTK), the official, modernized way to manage Redux state, as it was quite advanced for its time and worked very well with the principle of optimistic updates (immediately updating the UI as if an operation succeeded, then rolling back if the server later reports a failure), among other things. This significantly improved the app's responsiveness. The application itself was complex, as managing electric vehicles involved many features, options, and various scenarios that needed to be integrated.
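The pattern itself is framework-independent. Here is a minimal sketch (the real app used Redux Toolkit; the store, names, and API shape below are illustrative):

```typescript
// Framework-free sketch of the optimistic-update pattern.
type Vehicle = { id: number; nickname: string };

class VehicleStore {
  constructor(private vehicles: Vehicle[]) {}

  get(id: number): Vehicle | undefined {
    return this.vehicles.find(v => v.id === id);
  }

  // Optimistically rename a vehicle: update local state immediately,
  // then roll back if the server call fails.
  async rename(
    id: number,
    nickname: string,
    api: (id: number, nickname: string) => Promise<void>,
  ): Promise<void> {
    const previous = this.get(id)?.nickname;
    if (previous === undefined) return;
    this.patch(id, nickname);     // 1. optimistic local update, UI reflects it instantly
    try {
      await api(id, nickname);    // 2. real request runs in the background
    } catch {
      this.patch(id, previous);   // 3. roll back on server failure
    }
  }

  private patch(id: number, nickname: string): void {
    this.vehicles = this.vehicles.map(v => (v.id === id ? { ...v, nickname } : v));
  }
}
```

In RTK Query terms, the same shape is expressed with `onQueryStarted` plus `updateQueryData` and `patchResult.undo()` for the rollback.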

I also led the development of a highly complex full-stack banking application, initially as the sole engineer and later as team lead. The project required integration with third-party APIs such as Visa and LexisNexis to conduct sanctions and compliance checks on users across 150+ countries. I was responsible for designing a scalable database architecture, implementing the core backend logic for handling international transactions, and ensuring full compliance with the unique regulations and financial requirements of each jurisdiction.

In addition to the backend, I also built an intuitive frontend interface using Next.js, allowing users to view, search, and filter their transactions with all the necessary details. As the team grew, I took on the responsibility of mentoring new developers, managing task distribution, and maintaining high technical standards across the codebase.


ZebrineX

At ZebrineX we built a prototype project with React, Firebase, and Express. We also worked on a project handling payments in multiple ways (cash, bank cards, vouchers, etc.) simultaneously for a large German corporation. The technologies we used were Vue 3, Fastify, PostgreSQL, and more.

We had a project where we needed to integrate AI. The idea was to allow clients (IKEA, among others) to upload various large files (with millions of records, 50M+) into the system, from which we would read the content, vectorize it, and store it in a vector database. Then, when the client performed a search, we would return results based not just on keywords but also on semantic meaning.

As team lead, I was responsible for the overall architecture of the project, the technical decisions, and the implementation. Training and mentoring the team were also part of my responsibilities.
The approach I proposed was to handle the uploading and reading of these large files (containing millions of records) with Go, leveraging the fork-join model, because it can read files in parallel extremely quickly. Using Go for reading large files and then passing the data to Python for vectorization can be better for performance in many real-world scenarios, especially when the bottleneck is file I/O rather than the ML part.
Then we used Python and its libraries (NumPy, pandas, etc.) to clean and structure the datasets, perform vectorization, and store the data in the vector database. We built our own models for vectorization and clustering of the data.
As for Node.js, which we used as the base of the system, the overall architecture was a hybrid application, where each programming language was used for the task it excels at. This idea of using the best-suited language for each operation was key to achieving high performance.
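The search step itself reduces to ranking stored embedding vectors by similarity to the query embedding. A toy sketch (the real embeddings came from our own models; the 3-d vectors below are made up for illustration):

```typescript
// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

type Doc = { text: string; embedding: number[] };

// Return the topK records whose embeddings are closest to the query embedding.
function semanticSearch(query: number[], records: Doc[], topK = 2): string[] {
  return [...records]
    .sort((x, y) => cosineSimilarity(query, y.embedding) - cosineSimilarity(query, x.embedding))
    .slice(0, topK)
    .map(r => r.text);
}

// Toy corpus with hand-made 3-d "embeddings":
const corpus: Doc[] = [
  { text: "sofa assembly guide", embedding: [0.9, 0.1, 0.0] },
  { text: "couch installation manual", embedding: [0.85, 0.2, 0.05] },
  { text: "annual financial report", embedding: [0.0, 0.1, 0.95] },
];
const results = semanticSearch([0.88, 0.15, 0.02], corpus);
// The two furniture documents rank above the report, even though
// "sofa" and "couch" share no keywords.
```

A production vector database performs this same ranking with approximate nearest-neighbor indexes instead of a full scan.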

NG Coding / Build Co Lab

At NG Coding / Build Co Lab — a cloud-based software platform for managing construction processes and companies — we were developing a centralized hub that enables construction firms to create schedules, assign tasks, manage contracts, and track progress, budgets, and communication in real time, all enhanced by integrated artificial intelligence.

Initially, we utilized Google Vertex AI Studio (Vertex AI is Google Cloud's umbrella of ML products and services, spanning ready-made ML APIs, AutoML, fully custom model training, and BigQuery ML) together with pgvector. As the platform evolved, we enhanced our technology stack by transitioning to dedicated vector databases and adopting more specialized embeddings, as well as integrating LangChain (a framework for building LLM-powered applications that connects models to external tools and data) and MCP (the Model Context Protocol, which structures LLM-powered agents into modular model, context, tool, and control layers) to improve AI-driven capabilities.

As a team lead, my responsibilities included assisting and advising on the database architecture to optimize it for high performance.
Database optimization was a key factor in improving the overall performance of the application. This involved integrating additional vector databases, among other enhancements.

I also supported efforts to improve the codebase structure and the performance of the server-side business logic by recommending a migration to Fastify (a fast, lightweight Node.js web framework with schema-based validation and optimized HTTP handling), due to its higher performance characteristics. Additionally, I introduced and enforced several best practices to ensure the code was better structured and more readable, which we successfully implemented.

We also integrated complex construction-related formulas and workflows, on both the server and the frontend, to enable advanced analytics.

Later, I was actively involved in the development of the frontend, where we used the latest versions of Next.js and TanStack Query (a data-fetching and caching library that manages server state: fetching, caching, syncing, and updating data with minimal boilerplate). We integrated global state management in both the client and server components, and based on our research, TanStack Query was the only solution offering this level of flexibility at the time. In addition, I played a key role in integrating AI features using LangChain and MCP, including the development of agents and working with Google Vertex AI Studio.

Throughout the process, I frequently mentored and supported mid-level and senior engineers.


Tech Reserve

The next project I worked on was for a client of Tech Reserve who wanted to develop agents capable of opening and closing trades on the stock market.

Initially, the client proposed creating five separate agents, each making decisions based on data we scraped from a third-party professional portfolio. This would have resulted in five independent, autonomous projects. I optimized the architecture by implementing MCP (the Model Context Protocol, an open protocol that standardizes how LLM applications connect to external data sources and tools), using the MCP Inspector to test and debug the MCP servers. I designed a single agent equipped with tools that could be invoked in different situations; depending on the context and need, the agent would dynamically call the appropriate tool.

This approach significantly improved and streamlined the application's architecture, reduced development time, and enhanced code quality by adhering to clean and DRY principles. The client was extremely impressed and pleased with the outcome.

The technologies we used included Vercel's AI SDK (a TypeScript toolkit for building AI-powered applications and agents), MCP, and vector databases.
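The "one agent, many tools" design can be sketched as a tool registry plus a dispatcher (tool names and logic below are illustrative; in the real system the LLM selects the tool from its description via MCP, while here a trivial keyword match stands in for that decision):

```typescript
// Minimal sketch of a single agent with pluggable tools.
type Tool = { name: string; description: string; run: (input: string) => string };

class TradingAgent {
  private tools = new Map<string, Tool>();

  register(tool: Tool): void {
    this.tools.set(tool.name, tool);
  }

  // Dispatch a request to the first tool whose name it mentions.
  // An LLM-backed agent would instead reason over tool descriptions.
  handle(request: string): string {
    for (const tool of this.tools.values()) {
      if (request.includes(tool.name)) return tool.run(request);
    }
    return "no suitable tool";
  }
}

const agent = new TradingAgent();
agent.register({ name: "open_trade", description: "Open a position", run: () => "trade opened" });
agent.register({ name: "close_trade", description: "Close a position", run: () => "trade closed" });
agent.register({ name: "portfolio_scan", description: "Scrape the reference portfolio", run: () => "scan done" });

const reply = agent.handle("please open_trade on AAPL");
// reply === "trade opened"
```

Compared with five autonomous agents, one agent with five tools shares context, configuration, and deployment, which is where the reduction in development time came from.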

Next Project: Advancing AI Education through LangChain & Agent Architectures

SoftUni (Software University)
As part of my ongoing mission to bridge practical AI with real-world application, my next project involves collaborating with Software University (SoftUni), one of the most prominent tech education hubs in Eastern Europe, offering intensive, practice-based programs in software development, AI, data science, and digital technologies, to design and deliver an advanced lecture series on LangChain and AI agent architectures ("Dive into LangChain: Building AI Agents from Scratch", a seminar organized with the assistance of SoftUni). Building on a successful guest session where I demonstrated reasoning pipelines and tool integrations, this initiative aims to empower developers with hands-on knowledge, foster a vibrant AI community, and share cutting-edge techniques for building intelligent agents.