
About Me - Technical Achievements

The Story

As a child, I was always fascinated by computers. At the age of 20, I decided to pursue a career as a professional software developer. I began taking C++ courses and passed the Plovdiv University (PU) entrance exam with a score of 5.40. However, shortly after that, I moved to Sofia and started working at a company called Freido, where I initially worked with PHP.

Later, I joined StangaOne, where we actively used the Symfony Framework, along with Doctrine & Propel ORM, for backend development. Of course, we also worked with HTML, CSS, and JavaScript. In addition, we integrated various payment methods and implemented numerous other features.


During my 20 years of experience, I have worked for various companies both as a full-time employee and a part-time contractor. Throughout my career, I have developed my skills and gained a deeper understanding of the industry.

I have collaborated with a range of Bulgarian companies, as well as international ones.



StangaOne

At StangaOne, I worked on a variety of complex applications in fields such as banking and e-commerce. We integrated:

  • 🔥 various payment methods and APIs
  • 🔥 unique and attractive UIs
  • 🔥 complex server-side and frontend logic
Later on, about a year ago, they contacted me again, and we collaborated on a project for a large German corporation. The application was relatively large, with many internal services and a lot of legacy code. It included Event-Driven Architecture (EDA) components such as Kafka and RabbitMQ.
As a senior developer, I was part of a team responsible for developing a split-payment service. For example, when you're at checkout and want to pay for your products, the system allows you to pay part of the amount in cash and the rest with a debit card. In the end, the total is calculated, and the bill is settled.
Some of the technologies we used included Vue 2/3, Fastify, Kafka, PostgreSQL, TypeORM, and others.
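The core of the split-payment idea can be sketched in a few lines of TypeScript. This is an illustrative reconstruction, not the production service; amounts are kept in integer cents to avoid floating-point rounding:

```typescript
// Illustrative split-payment settlement: part of the bill is paid in cash,
// the remainder goes to the card, and the two parts must cover the total
// exactly. All amounts are in cents (integer math, no rounding errors).
type SplitPayment = { cashCents: number; cardCents: number };

function splitBill(totalCents: number, cashCents: number): SplitPayment {
  if (cashCents < 0 || cashCents > totalCents) {
    throw new Error("Cash amount must be between 0 and the total");
  }
  return { cashCents, cardCents: totalCents - cashCents };
}
```

For example, a €100.00 bill with €35.00 paid in cash leaves €65.00 to be charged to the card.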


Diversity.com

One of the most interesting projects I have worked on is Diversity.com — a job posting portal for minorities in the United States.

After presenting several ideas and developing the first version of the website, the client was highly impressed with my work and style and promoted me to the role of Technical Manager. They handled the sales aspect, while I was responsible for the technical side of the business, including website development, tools, team, and online marketing.

To support the project's growth, we needed to expand the team by hiring additional programmers, designers, and marketers. Once we assembled the team, the business took off significantly, generating a monthly profit of approximately $100,000.

A large part of this success was due to my marketing strategies! One of the biggest achievements was getting the website ranked as the #1 result on Google for the keyword "diversity", a remarkable milestone considering how extremely competitive the U.S. market is for top-ranking keywords. This happened around 2017.

I learned how to create websites that appear on the first page and even in the first position on Google.

Responsibilities:

  • 🔥 Developing complex applications, integrating third-party APIs and payment methods
  • 🔥 Managing the marketing strategy
  • 🔥 Managing a team of 4 specialists
  • 🔥 Communicating with stakeholders, planning, and goal-achievement strategy
  • 🔥 High-level system design and architecture
Some of the technologies we used included: PHP (Symfony Framework), MySQL, JavaScript (AJAX, jQuery), HTML, CSS, and SEO (Search Engine Optimization).

Interesting case:

While we were developing this application, there were many additional projects and technical challenges that I had to solve.

One particular case study involved a database with over 100 million records and daily traffic of 50,000–70,000 visits to the homepage. The challenge was to implement a slider that retrieves results from the database via AJAX and displays them on the homepage, all while ensuring a load time under 1 second (an SEO requirement).

Simply optimizing the database query was not enough — a non-standard solution was required!

I came up with an effective approach: I created a cron job that selects 250 job posts from millions of records. This cron job runs every six hours and stores these 250 job posts in a JSON file. The slider then randomly selects 60 records from this pre-generated JSON file, ensuring fast page load times.
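A minimal sketch of that caching approach in TypeScript, assuming a hypothetical fetchSample() in place of the real database query:

```typescript
import { writeFileSync, readFileSync } from "node:fs";

type JobPost = { id: number; title: string };

// Runs every six hours (scheduled via cron): pick a fixed sample of posts
// and cache it as JSON so the homepage never queries the huge table.
// fetchSample() stands in for the real DB query and is hypothetical.
function refreshCache(fetchSample: () => JobPost[], path: string): void {
  const posts = fetchSample().slice(0, 250); // cap the cache at 250 posts
  writeFileSync(path, JSON.stringify(posts));
}

// Called by the AJAX endpoint: load the cached file and return a random
// subset of 60 posts for the slider.
function sliderPosts(path: string, count = 60): JobPost[] {
  const posts: JobPost[] = JSON.parse(readFileSync(path, "utf8"));
  // Quick-and-dirty shuffle; fine for a demo, not uniformly random.
  const shuffled = [...posts].sort(() => Math.random() - 0.5);
  return shuffled.slice(0, Math.min(count, shuffled.length));
}
```

Since the endpoint only reads a small pre-generated file, response time stays well under the 1-second budget regardless of table size.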

After Diversity.com, I started actively working with Node.js (Express), React (Next.js), TypeScript, and JavaScript technologies more broadly; this remains my main tech stack today.



Facebook Ads

I was also approached by individuals from minority communities in Bulgaria who were consistently facing issues with Facebook’s ad approval and account restrictions. Because automated moderation systems often flagged their content unfairly, they sought my support in running their campaigns. I provided strategic guidance and technical support to help them navigate Facebook’s ad policies, including optimizing ad creatives (at one point, we managed to optimize one campaign down to just $0.001 per click, one of the lowest CPCs we’ve ever achieved), managing compliant ad accounts, and, where necessary, using legitimate workarounds to ensure their voices could be heard. This experience reflects my ability to find ethical, practical solutions in challenging digital environments and demonstrates my commitment to supporting underrepresented communities online.



Web3

Another interesting project I worked on was for InfiniGods.com, a Web3 gaming platform where users can play, earn NFTs, and trade them. The project was originally outsourced to an external agency, but their implementation was poorly done: it used a hybrid setup with PHP for the backend and an outdated version of React (v16) for the frontend. As expected, this caused multiple issues, especially since PHP lacked native Web3 tooling, while the JavaScript ecosystem had already matured with libraries like ethers.js.

As a senior developer, my responsibility was to take over the entire project from the external team, set up and manage a new Linux-based VPS server, and rebuild the application from scratch using Next.js (with TypeScript) and Express.js for the backend. This new architecture allowed us to implement essential business features properly, including MetaMask wallet integration, Ethereum smart contract communication, and other blockchain-related functionality.

It was also the first time I learned how to interact with smart contracts written in other languages (e.g., Solidity ) directly from JavaScript code, which was a powerful and exciting experience. I gained solid hands-on knowledge of web3, especially around wallet integration, blockchain interaction, and decentralized application (dApp) infrastructure.
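As a small, self-contained illustration of the unit handling that comes up constantly in wallet and contract work: on-chain amounts arrive as wei (1 ETH = 10^18 wei) and must be converted for display. The project used ethers.js utilities for this; the sketch below avoids the dependency and uses plain bigint arithmetic:

```typescript
// Convert a wei amount (bigint) to a decimal ETH string.
// 1 ETH = 10^18 wei, so integer division gives the whole part and the
// remainder gives the fractional part.
function weiToEth(wei: bigint): string {
  const WEI_PER_ETH = 10n ** 18n;
  const whole = wei / WEI_PER_ETH;
  const frac = wei % WEI_PER_ETH;
  if (frac === 0n) return whole.toString();
  // Pad the fraction to 18 digits, then trim trailing zeros.
  const fracStr = frac.toString().padStart(18, "0").replace(/0+$/, "");
  return `${whole}.${fracStr}`;
}
```

Doing this with bigint rather than number matters: 10^18 exceeds JavaScript's safe integer range, so floating-point math would silently lose precision.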

This project greatly expanded my expertise in both modern full-stack development and the web3 ecosystem, and I’m truly grateful for the opportunity to work on such cutting-edge technology.
Check out my simple personal Web3 demo project.



We Craft Media
I worked with We Craft Media on a project where we built a React & TypeScript interface for managing electric vehicles for one of their clients. The backend was built with Express.js. As a senior developer, I was responsible for the frontend, and one of the ideas I proposed at the time was to adopt Redux Toolkit (RTK), as it was quite advanced for its time and worked very well with the principle of optimistic updates, among other things. This significantly improved the app's responsiveness. The application itself was complex, as managing electric vehicles involved many features, options, and various scenarios that needed to be integrated.
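The optimistic-update principle itself is easy to sketch without RTK. This illustrative helper (names are hypothetical, not the client's code) applies a change locally right away and rolls back only if the server call fails:

```typescript
type Vehicle = { id: string; name: string };

// Minimal optimistic-update helper: apply the change to local state
// immediately, persist in the background, and roll back on failure.
async function optimisticRename(
  state: Vehicle[],
  id: string,
  newName: string,
  save: (id: string, name: string) => Promise<void>,
): Promise<Vehicle[]> {
  const previous = state;
  // 1. Optimistically update a local copy (the UI renders this at once).
  const next = state.map((v) => (v.id === id ? { ...v, name: newName } : v));
  try {
    await save(id, newName); // 2. Persist in the background.
    return next;             // 3a. Keep the optimistic state on success.
  } catch {
    return previous;         // 3b. Roll back on failure.
  }
}
```

The user sees the new value instantly instead of waiting for the round trip; the rare failed request simply restores the previous state.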




FinTech Project

I also led the development of a highly complex full-stack banking application, initially as the sole engineer and later as team lead. The project required integration with third-party APIs such as Visa and Lexis Nexis to conduct sanctions and compliance checks on users across 150+ countries. I was responsible for designing a scalable database architecture, implementing the core backend logic for handling international transactions, and ensuring full compliance with the unique regulations and financial requirements of each jurisdiction.

In addition to the backend, I also built an intuitive frontend interface using Next.js, allowing users to view, search, and filter their transactions with all the necessary details. As the team grew, I took on the responsibility of mentoring new developers, managing task distribution, and maintaining high technical standards across the codebase.

Technologies used:  NextJS, TypeScript, Oracle DB, MongoDB, Fastify, LexisNexis API, Visa API


ZebrineX

ZebrineX is an outsourcing company, so I was placed with different clients on different projects. With the first client, we built a prototype using React, Firebase, and Express.js.

At another client—a Romanian company—we needed to integrate AI. The goal was to let clients (IKEA, etc.) upload very large files (millions of records—50M+) into the system. We would read the content, vectorize it, and store it in a vector database. Searches would return results using both keyword matching and semantic meaning.

As team lead, I owned the architecture, technical decisions, and implementation. Training and mentoring the team were also part of my responsibilities.
I proposed handling the upload and ingestion of these large files (millions of records) in Go, using a fork-join model so we could read files in parallel very quickly. Then we used Python (NumPy, pandas, etc.) and its ecosystem to clean and structure the datasets, vectorize them, and store the results in the vector database. We built our own models for vectorization and clustering.
Node.js was the backbone of the system: a hybrid stack where each language was used where it fit best. That was how we kept performance high end to end.
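The fork-join idea can be sketched in any language; here is an illustrative TypeScript version using Promise.all (in the actual project the ingestion step was written in Go with goroutines):

```typescript
// Illustrative fork-join pattern: split a large batch of records into
// fixed-size chunks, process the chunks concurrently, then merge the
// results back in their original order.
async function forkJoin<T, R>(
  items: T[],
  chunkSize: number,
  work: (chunk: T[]) => Promise<R[]>,
): Promise<R[]> {
  // Fork: slice the input into chunks.
  const chunks: T[][] = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    chunks.push(items.slice(i, i + chunkSize));
  }
  // Run all chunks concurrently, then join the results in order.
  const results = await Promise.all(chunks.map(work));
  return results.flat();
}
```

With CPU-bound work the chunks would go to worker threads or separate processes; the fork/join structure stays the same.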

Technologies used: React, Firebase, Express.js, Go, Python (NumPy, Pandas, PyTorch, scikit-learn, Hugging Face Transformers, Matplotlib, FastAPI)


NG Coding / Build Co Lab

At NG Coding / Build Co Lab — a cloud-based software platform for managing construction processes and companies — we were developing a centralized hub that enables construction firms to create schedules, assign tasks, manage contracts, and track progress, budgets, and communication in real time, all enhanced by integrated artificial intelligence.

Initially, we utilized Google Vertex AI Studio and pg-vector, but as the platform evolved, we enhanced our technology stack by transitioning to dedicated vector databases and adopting more specialized embeddings, as well as integrating LangChain and MCP to improve AI-driven capabilities.

As a team lead, my responsibilities included assisting and advising on the database architecture to optimize it for high performance.
Database optimization was a key factor in improving the overall performance of the application. This involved integrating additional vector databases, among other enhancements.

I also supported efforts to improve the codebase structure and the performance of the server-side business logic by recommending a migration to Fastify, due to its higher performance characteristics. Additionally, I introduced and enforced several best practices to ensure the code was better structured and more readable, which we successfully implemented.

We had to integrate complex construction-related formulas and workflows, both on the server and the frontend, which we implemented to enable advanced analytics.

Later, I was actively involved in the development of the frontend, where we used the latest versions of Next.js and TanStack. We integrated global state management in both the client and server components, and based on our research, TanStack was the only solution offering this level of flexibility at the time. In addition, I played a key role in integrating AI features using LangChain and MCP, including the development of agents and working with Google Vertex AI Studio.

Throughout the process, I frequently mentored and supported mid-level and senior engineers.

Technologies used:  Node.js, TypeScript, MCP, MCP Inspector, LangChain, PostgreSQL, Google Vertex AI Studio, GSP, NextJS, TanStack Query, Python - NumPy, Pandas, PyTorch, Scikit-learn, Hugging Face Transformers, Matplotlib, FastAPI


Tech Reserve

The next project I worked on was for a client of Tech Reserve who wanted to develop agents capable of opening and closing trades on the stock market.

Initially, the client proposed creating five separate agents, each making decisions based on data we scraped from a third-party professional portfolio. This would have resulted in five independent, autonomous projects. I optimized the architecture by implementing MCP (Model Context Protocol) together with MCP Inspector: I designed a single agent equipped with tools that could be invoked in different situations. Depending on the context and need, the agent would dynamically call the appropriate tool.
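The single-agent, many-tools design can be sketched as follows. This is a simplified illustration, not the actual MCP SDK API: one agent holds a registry of tools and dispatches to whichever fits the request, instead of maintaining five separate agent codebases:

```typescript
// Simplified single-agent design: tools register themselves with the
// agent, and the agent dispatches calls by tool name. In a real MCP
// setup the model selects the tool from the registered descriptions.
type Tool = { name: string; run: (input: string) => string };

class TradingAgent {
  private tools = new Map<string, Tool>();

  register(tool: Tool): void {
    this.tools.set(tool.name, tool);
  }

  invoke(toolName: string, input: string): string {
    const tool = this.tools.get(toolName);
    if (!tool) throw new Error(`Unknown tool: ${toolName}`);
    return tool.run(input);
  }
}
```

Adding a new capability then means registering one more tool, rather than standing up another autonomous project.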

This approach significantly improved and streamlined the application's architecture, reduced development time, and enhanced code quality by adhering to clean and DRY principles. The client was extremely impressed and pleased with the outcome.

The technologies we used included Vercel's AI SDK, MCP, and vector databases.


Technologies used:  Node.js, Express, TypeScript, Vercel AI SDK, MCP, MCP Inspector, GSP, PRISMA, MongoDB, Qdrant, Python - NumPy, Pandas, PyTorch, Scikit-learn, Hugging Face Transformers, Matplotlib, FastAPI


Next Project: Advancing AI Education through LangChain & Agent Architectures

Software University - SoftUni
As part of my ongoing mission to bridge practical AI with real-world application, my next project involves collaborating with Software University (SoftUni) to design and deliver an advanced lecture series on LangChain and AI Agent Architectures. Building on a successful guest session where I demonstrated reasoning pipelines and tool integrations, this initiative aims to empower developers with hands-on knowledge, foster a vibrant AI community, and share cutting-edge techniques for building intelligent agents.

Technologies used:  LangChain, TypeScript










Akkodis

The next company I worked for was Akkodis. I was part of a relatively small team of specialists working on a platform for monitoring and managing freight vehicles.

Right from the start, I noticed that the application was built on outdated architectural and technological foundations, despite the project being relatively recent (2-3 years old, or at least that is how long the company had owned it). During this period, no substantial steps had been taken toward modernization, and an established but already inefficient development approach was being followed.

I approached the work with full commitment. In a standard two-week sprint, where most colleagues completed around 10-15 story points, I started completing around 33 points, and in the next sprint 22 points, alongside other architectural and optimization tasks. I consistently exceeded the expected quota, which clearly showed a high level of diligence and efficiency.

One of my key contributions to the project was the set of architectural proposals I began actively sharing. I directed attention to applying Clean Architecture, MVVM, and DRY principles, as the existing codebase often lacked separation of concerns. In one file you could see business logic, UI logic, and CSS all gathered in one place, with some components exceeding 1,000 lines of code, without clear structure or organization.

However, my most significant idea was the introduction of Optimistic Updates, a fundamental principle for modern frontend applications. After detailed analysis, it was determined that its implementation would eliminate a significant number of unnecessary requests to the server (over 100). Instead of operations like creating or editing objects taking an average of about 5 seconds (completely unacceptable for a modern application), they would execute instantly. Additionally, reducing the number and volume of requests led to a substantial reduction in hosting costs, making the system more efficient and economically optimized.

Overall, I established myself with high productivity, as well as with a number of architectural and optimization ideas aimed at better maintenance, scalability, and efficiency of the project. However, at a certain point I made the decision to leave mostly because of the toxic team environment and management.

Technologies used:  React, Node.js, GraphQL, AWS, OpenSearch, TypeScript, Python (NumPy, Pandas, PyTorch, scikit-learn, Hugging Face Transformers, Matplotlib, FastAPI)




Agents - AI First Delivery

I was also working on an idea aimed at increasing my productivity by integrating AI more and more into my workflow. At some point, I realized that I could maintain multiple copies of the same codebase and run several agents in parallel, each working independently on the code—fixing bugs or developing new features. In this setup, my role would mainly be to review the generated code and apply changes where necessary.

Achieving this requires a deep understanding of the codebase, very well-defined instructions (rules for the AI) provided upfront to the agents, and strong software engineering expertise to ensure the code is written correctly and with high quality. Ultimately, everything still depends on the software engineer, while the agents serve only as a way to speed up the process and boost productivity.

Every client wants things done faster and more cost-effectively, but never at the expense of quality—and this balance can be achieved through proper agent management.

This does not mean that programmers will become obsolete or lose their jobs—on the contrary, skilled specialists will always be in demand. However, there will be a shift. Just like when the assembly line was introduced, some professions disappeared while new ones emerged in the market. It’s all a matter of adaptation.