Author: admin

  • Day 62 – Laravel + Frontend Tooling + Random Thoughts

    Today, I was looking at making a final decision on whether to use a React or a Vue frontend. My prototype has been coming along, but I had just been using plain Blade templates with vanilla JavaScript for quickness.

    I think web development has become totally overcomplicated, and you can go a long way with vanilla stuff. However, I would never start a PHP project without Laravel, since I know it has virtually everything I need to build whatever I want.

    Since LLMs came about, there’s a lot to be said for just writing functionality in JavaScript; but ultimately you do want to benefit from a frontend framework. State management, DOM manipulation and an event bus are the reasons I want to use one. And of course, components. It’s a shame the official Web Components standard is so poor.

    React Or Vue? Or Livewire?

    The age-old debate for a developer. Quite frankly, it’s annoying at this point to continually decide what to use. I prefer the simplicity of Vue syntax, but I often hire graduates from a bootcamp that teaches React, and I have bought quite a few premium templates that use React too. To be honest though, with LLMs it’s so easy to switch between them.

    Then there’s Livewire. Each time I’ve tried to get into Livewire I’ve been put off by it. But I had the same resistance when I got into Flutter, and once I persevered I loved it.

    To help me decide, I looked into Filament.

    Filament

    Filament is built on the TALL stack (Tailwind, Alpine, Laravel, Livewire), and I’ve used previous versions and been fairly impressed. This would be a few steps back in order to take a leap forward, but after watching the excellent Laracasts series on it, I’m starting to see the benefits.

    • The resource creation gives you the CRUD UI upfront … this is slightly different from the CRUD UI generation I have been working on, since mine is more ‘on-the-fly’ than defined in YAML files. Still, resource creation results in a full-on CRUD interface with sortable datatables out of the box.
    • The form library is pretty much flawless. No more messing about with forms.
    • Select options can be configured with enums and get enforced really nicely.
    • Multi-tenant stuff out of the box
    • Can hook up with Laravel Cashier (Stripe) for subscriptions

    Laravel Herd

    • I’ve been a fan of Laravel Sail for a while, but only because it abstracted complication away from me. Whilst I get tired of learning new things, that’s no reason not to embrace something that works even better. Enter Laravel Herd. Some benefits:
      • I don’t have to use Docker Desktop, which hogs a huge amount of system memory. You can change this in its settings, but overall Herd is a lot faster.
      • For some reason, my local Composer would not install the Laravel CLI tool on my machine, so I had to resort to installing Laravel via some shell scripts. Herd got it installed immediately.
      • Herd includes Tinkerwell, some debugging tools, and general system configuration

    Consultancy Fees

    I’m getting tired of being asked to ‘look into stuff’ with the expectation it will be done for free. I need a consultancy page hooked up to a simple credit card payment.

  • Day 61 – Prototyping

    I’ve been using my very basic prototype to store data about ‘things’ in my life – I use LLMs to set up this data, and then I have a UI that lets me interact with it.

    One of the things I’ve been doing with it is keeping track of projects, ideas and tasks … and the problem I started to face was that I didn’t like the UI I had used as the index for these data items.

    So last week I’d spotted a nice slider effect that I liked, and today I decided to put time into my own project for once, and just implement that.

    So now I still have a CRUD interface for my data, but I’ve also integrated the beginnings of a nice swiper interface so that I can swipe through each ‘thing’ one at a time; consider it one at a time; and then move onto the next.

    I made an animated GIF to briefly showcase it. Since I’m running out of time today, I’ll leave it here … it’s currently cycling through my blueprints … a good start, and it’s going to be a much more enjoyable way of cycling through the things going on.

    And of course eventually the images will be generated according to the content.

    More updates tomorrow.

  • Day 58–60 – Quick Catchup & the MCP (Model Context Protocol) Revolution

    It’s been a few days since I posted last. I’m going to let myself off, but I resolve to do better in the future – I do want to keep posting on a daily basis, but I’m aware I want to make the posts more valuable, so that they represent value that can then go on to LinkedIn and the other social networks.

    Things are definitely building in my mind; and projects are coming along. However, still a long way from anything concrete taking off.

    For the moment I would just like to talk about MCP, because it really is a very significant milestone in the AI world. (Image context from OpenAI was another one recently.)

    MCP

    Model Context Protocol was announced in Nov 2024 by Anthropic.

    This is a big step forward since it has standardised how language models can/should interact with external tools.

    In order for LLMs to be more useful, they need to be able to action things.

    I’m guessing that once Anthropic had integrated a couple of tools with their LLMs they realised it would be better to have a standard way of doing that … and from that they came up with the MCP architecture.

    Even at a first understanding, giving a language model a standard way of interfacing with the outside world is a bit of a game-changer, and there are already a ton of integrations we can use straight away. Things are happening really fast.

    Core Architecture

    Essentially, you have an application (called the Host), and this has a Client inside it.

    The Client maintains connections to Servers, which have access to the tools you want the LLM to connect to.

    Protocol Layer

    It’s fairly simple … you have:

    • Requests
    • Notifications
    • Results

    You have functions that:

    • Handle incoming requests
    • Send requests and await responses
    • Handle incoming notifications
    • Send notifications
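    Sketched as a toy in Python (this is not the real MCP SDK – every class and method name here is invented purely for illustration), those four functions might look like:

```python
import itertools

class ProtocolSession:
    """Toy sketch of a protocol layer: requests are paired with
    responses by id, notifications are dispatched with no reply."""

    def __init__(self, send):
        self.send = send                       # callable that transmits a message dict
        self._ids = itertools.count(1)
        self._request_handlers = {}
        self._notification_handlers = {}

    def on_request(self, method, handler):
        self._request_handlers[method] = handler

    def on_notification(self, method, handler):
        self._notification_handlers[method] = handler

    def send_request(self, method, params=None):
        msg = {"jsonrpc": "2.0", "id": next(self._ids), "method": method}
        if params is not None:
            msg["params"] = params
        self.send(msg)
        return msg["id"]                       # caller awaits the result with this id

    def send_notification(self, method, params=None):
        msg = {"jsonrpc": "2.0", "method": method}   # no id => notification
        if params is not None:
            msg["params"] = params
        self.send(msg)

    def receive(self, msg):
        method = msg.get("method")
        if "id" in msg:                        # request: must produce a result
            result = self._request_handlers[method](msg.get("params"))
            self.send({"jsonrpc": "2.0", "id": msg["id"], "result": result})
        else:                                  # notification: MUST NOT be replied to
            handler = self._notification_handlers.get(method)
            if handler:
                handler(msg.get("params"))
```

    The important split is visible in receive(): the presence of an id is the only thing that distinguishes a request (reply required) from a notification (reply forbidden).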

    Transport Layer

    All transport layers use JSON-RPC 2.0.

    JSON-RPC Request Object

    • jsonrpc : always going to be “2.0”
    • method
    • params (optional)
    • id

    Without an id, a message is considered a notification that doesn’t expect a response. In fact, servers MUST NOT reply to a notification.

    JSON-RPC Response Object

    • jsonrpc (version string)
    • result (on success)
    • error (on failure – a response carries either result or error, never both)
    • id
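    Putting the two object shapes together in a quick sketch (the method name “tools/list” is an MCP method as I understand the spec, and -32601 is the standard JSON-RPC “method not found” error code):

```python
import json

# A request: has an id, so the server must reply.
request = {"jsonrpc": "2.0", "id": 1,
           "method": "tools/list", "params": {}}

# A notification: same shape but no id -- the server MUST NOT reply.
notification = {"jsonrpc": "2.0",
                "method": "notifications/initialized"}

# A success response echoes the request id and carries a result.
response = {"jsonrpc": "2.0", "id": 1,
            "result": {"tools": []}}

# An error response carries an error object instead of a result.
error = {"jsonrpc": "2.0", "id": 1,
         "error": {"code": -32601, "message": "Method not found"}}

print(json.dumps(request))
```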

    To be clear, MCP is a standard way – a proposed specification that multiple parties can agree on – for how language models will interact with outside tools.

    Protocols are vital in tech. We have TCP/IP and the email protocols, which you use all the time. Without that agreement, we could have had an incredibly fractured internet.

    When companies and developers can agree to do things in a certain way, it makes it easier to make systems.

    Using MCP is, I assume, very much like working with interfaces. If you code an LLM up to work with your own tools and then for whatever reason decide to switch LLMs, using a protocol means there’s no lost work – because the new one will use exactly the same interface as the current one.
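    That interface analogy can be sketched in Python (all class and method names here are made up for illustration, not from any real SDK):

```python
from typing import Protocol

class ToolServer(Protocol):
    """The agreed interface: any server exposing these methods
    can be driven by any client that speaks them."""
    def list_tools(self) -> list: ...
    def call_tool(self, name: str, args: dict) -> str: ...

# Two hypothetical servers implementing the same interface.
class WeatherServer:
    def list_tools(self) -> list:
        return ["get_forecast"]
    def call_tool(self, name: str, args: dict) -> str:
        return f"Sunny in {args['city']}"

class CalendarServer:
    def list_tools(self) -> list:
        return ["add_event"]
    def call_tool(self, name: str, args: dict) -> str:
        return f"Added '{args['title']}'"

def run_agent(server: ToolServer) -> list:
    # The client side depends only on the interface, so servers
    # (or the model behind them) can be swapped with no lost work.
    return server.list_tools()
```

    Swapping WeatherServer for CalendarServer needs no changes on the client side – which is the whole point of agreeing on a protocol.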

    Reference

    4. Request object

    A rpc call is represented by sending a Request object to a Server. The Request object has the following members:

    • jsonrpc – A String specifying the version of the JSON-RPC protocol. MUST be exactly “2.0”.
    • method – A String containing the name of the method to be invoked. Method names that begin with the word rpc followed by a period character (U+002E or ASCII 46) are reserved for rpc-internal methods and extensions and MUST NOT be used for anything else.
    • params – A Structured value that holds the parameter values to be used during the invocation of the method. This member MAY be omitted.
    • id – An identifier established by the Client that MUST contain a String, Number, or NULL value if included. If it is not included it is assumed to be a notification. The value SHOULD normally not be Null [1] and Numbers SHOULD NOT contain fractional parts [2]

    The Server MUST reply with the same value in the Response object if included. This member is used to correlate the context between the two objects.

    [1] The use of Null as a value for the id member in a Request object is discouraged, because this specification uses a value of Null for Responses with an unknown id. Also, because JSON-RPC 1.0 uses an id value of Null for Notifications this could cause confusion in handling.

    [2] Fractional parts may be problematic, since many decimal fractions cannot be represented exactly as binary fractions.

  • Day 57 – The Habits Ahead

    Strong results come from doing the right thing many times over.

    Consistency. Discipline.

    I’ve got multiple projects going on at the moment, and they can easily fill up my days entirely, leaving no time for the things that will actually continue to build the business – yet those are exactly the things I keep dropping.

    These are a good additional starting point to my existing habits:

    • Grant and funding research & application
    • Event, conference and networking research
    • Email marketing pipeline
    • Website marketing
    • Social media marketing

    It’s a bit like programming yourself for success: you decide what actions the business would benefit from, and which areas need to be attended to consistently.

    These become the KPIs of your business.

  • Day 56 – Pandora’s Box

    I think the way the world is right now, people need to feel like there is a new wave of ‘something good’ … and whilst AI is certainly going to take many jobs, it opens up a whole world of possibility for those who are prepared to learn about it, work hard and creatively use it. So there has been this huge positive wave of energy (certainly amped up with money) toward innovation. It may well be that LLMs have limitations, and that we are witnessing a bubble… but people are now TRYING new things. Technical and non-technical people are finding they can do a whole lot more using AI. What I mean is, a lot of crazy ideas are being unlocked. Pandora’s box has been opened.


    True Personal Assistants

    For me, it gives me the opportunity to build a personal assistant of the magnitude I’ve wanted for a few decades. I think science fiction movies and video games influenced me in wanting to build these personal assistants.

    Examples:

    JARVIS Assistant From Iron Man Movies

    J.A.R.V.I.S., which stands for “Just A Rather Very Intelligent System,” is Tony Stark’s natural-language user interface computer system, named after Edwin Jarvis, the butler who worked for Howard Stark and the Stark household. Initially, J.A.R.V.I.S. was developed as a simple AI assistant to control Stark’s Iron Man suit, but it evolved into a powerful AI capable of managing various tasks and assisting Stark in his personal and superhero life. J.A.R.V.I.S. uses advanced natural language processing and communication skills to understand and respond to Stark’s commands, making it more than just an AI tool—it’s a trusted companion.

    So, the ‘trusted companion’ thing here is the key.

    Cortana from Halo

    As an artificial construct, Cortana has no physical form or being. Cortana speaks with a smooth female voice, and projects a holographic image of herself as a woman. Cortana is said to resemble Halsey, with a similar attitude “unchecked by military and social protocol”. In Halo: The Fall of Reach, Cortana is described as slender, with close-cropped hair and a skin hue that varies from navy blue to lavender, depending on her mood. Numbers and symbols flash across her form when she is thinking. Halsey sees Cortana as a teenage version of herself: smarter than her parents, always “talking, learning, and eager to share her knowledge”. Cortana is described as having a sardonic sense of humor and often cracks jokes or wryly comments, even during combat.

    There are a few more, but that’s enough for now.

    So, the key components are:

    • LLMs (Language Models)
    • Decision Trees
    • GenerativeUI
    • Automation flows
    • Data Mining & Analysis
    • Context and Object Oriented Memory
    • MCP

    In the future, I imagine the internet will have pretty much been abstracted away from us. That isn’t a great thought for most of us, but it’s kind of a natural progression. Things change; for instance, it was naive of me to think I could have another 20 years of making money doing the same thing (i.e. programming) until I retire.

    At school, I had a maths teacher who used to program in assembly. Talk about talent… in a few decades will we have any humans who know how to write assembly? I always wondered why he didn’t still program; it was probably because he didn’t move on.

    In the same way, for those of us who understand HTML/CSS/JS on a deep level, these skills are fast being abstracted away by vibe coding apps. The vibe coding apps that combine with Supabase are going to get better and better; but will the prediction machine ever get so good that it really understands programming and produces super clean code? Probably yes, as our prompt engineering gets better at helping it. Eventually, personal assistants will replace whatever requirements vibe coding creates now.

    The next big waves/industries to come along will be more widespread autonomous machines (drones, robots, cars), then surveillance and wide-scale sentiment monitoring; beyond that, some sort of biohacking industry with longevity at its core; and video games will have a resurgence in originality. So I do hope something will replace the employment drop-off that will shortly happen.

    That’s enough thoughts for today.


    Work Updates:

    • I continue to work on a side project to bring in a small amount of money but enough to cover personal costs. It’s a fun, high potential project, and made me realise I do like working on IT dev projects, and am quickly improving at the management of them.
    • The DXP project is slowly getting to the beta launch, which is great. Once that’s hit, I will share more details.
    • My own platform is coming along. I’ve shown it to a few people, and I know these people wouldn’t sugarcoat stuff – it’s super early days, but the response was reasonably positive. Just got to keep going.
    • The ideas that would drive an AI Agency are coming together; but still a little way off from doing that
    • Continually trying to keep up with the industry
    • More and more aware of the necessity of marketing for everything

  • Day 55 – The Founder’s Story

    The days roll on.

    Today was a day of calls.

    Today taught me one thing: the act of posting every day actually has better consequences than just post engagements on LinkedIn.

    I was doing a few interviews today – graduates from TechEducators software development bootcamp … and a few of them had read through my posts on this site and LinkedIn. It was interesting to hear that they had done that, since I pretty much just write for myself at the moment.

    But I realise that the story you create when starting something is actually important because, if they choose to, people can gain context on what you are doing. It’s not just about flawlessly sanitized marketing … founders can benefit from just being genuine and not being afraid of being seen as imperfect. You just have to not care about that aspect.

    Anyway, it’s been challenging to do a post on an almost daily basis, but along the way you realise that it’s having unintended positive consequences. Which is great.

    Improvements to my story marketing

    I’m going to need to scale up, with a bit of automation, into other social networks, and do more videos. I want to focus first on the core original content that I create, and then look at what I can use AI for to supplement it. But I need to scale into video content focused more on AI value, and also the founder’s story.

    IT Projects

    I’m working on another project which will remain nameless. It’s a tricky one since it’s a great project with great potential; but it suffers from the very common IT project pitfalls – most of which I have fallen into in my career:

    • At the beginning of a project, it’s OK to do some R&D prototyping to see what’s possible, and it’s OK to use agile … but sometimes you actually need to plan through what the end product MUST do. When you are replacing a system, or starting a non-trivial one from scratch, you can’t just wing it. You must be specific about what you’re building and, more importantly, HOW you are going to get there. Make the decisions AHEAD of time. This is a massive skill, and vibe coding people will never do this… we will see how it turns out over the long term.
    • During the project you need to communicate; as soon as that starts breaking down, things slow down. And the communication needs to be kept in places other than emails or messaging. Find what works for your team.
    • When things start getting a bit tricky, bring in specialist help. I can’t emphasise this enough. Whether that is a talented project manager, or a developer with better skills in your tech stack… don’t hesitate. You need someone sooner than you think.
    • DOCUMENT! When projects are several years down the road, onboarding new developers is slow, since they have to figure it all out from scratch.

  • Day 54 – Building an AI Agency

    Today’s post is a bit more of a strategic analysis of the situation.

    • There is a clear demand for more information on AI, across the board – from SMEs to enterprise.
    • AI Agencies are a fairly new phenomenon, and existing agencies will eventually realise they need to pivot to offer this service.
    • There is too much information out there regarding AI, and every day I find a new tool or a bit of news that blows me away. A person who is not focused fully on AI doesn’t have the time or inclination to fully keep up with that.
    • Whilst there is an abundance of information, how much of it is actually relevant and actionable? A lot of people are now just getting their information from language models, coming up with strategies that sound good on the surface, but actually don’t work in practice.
    • Apart from people who have been at it for a few years, most people are nowhere close to understanding how to actually use AI to build/grow their business. They know there’s something that can help, but they aren’t sure specifically what that is.

    So, it’s clear to me that an AI Agency fulfils a demand that most companies are going to realise they need fairly quickly; and over the last fifty days or so I’ve figured out more or less that AI for business is about:

    • Being able to gather meaningful data from the internet in an automated fashion; using web crawling techniques blended with AI
    • In tandem with data gathering, AI can help with lead generation and create workflows that guide a customer through their journey with a company. LLMs can be used to tailor meaningful sales emails to send out to people
    • Having a system driven by AI that oversees the entire business operations
    • Virtual Agents and Chatbots that act as the first point of contact for people using either websites or going through other ecosystems like whatsapp
    • Using automation tools to reduce time spent on manual tasks. Saving even five minutes per week on a regular task is roughly 4 hours per year saved, so over ten years for a company it’s the equivalent of a single employee’s working week.
    • Driving the content strategy with automation and generated content to supplement current marketing efforts
    • Able to develop digital features faster using AI code helpers and ‘vibe coding’
    • Able to talk to their data more effectively and gain insights
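    The time-savings arithmetic in the automation point above is easy to check:

```python
minutes_per_week = 5
weeks_per_year = 52

# 5 minutes a week adds up over a year...
hours_per_year = minutes_per_week * weeks_per_year / 60
print(round(hours_per_year, 1))      # ~4.3 hours saved per year

# ...and over a decade it's roughly one employee's working week.
hours_over_ten_years = hours_per_year * 10
print(round(hours_over_ten_years))   # ~43 hours
```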

    It’s also clear to me that I still really enjoy, and am good at, developing non-AI systems – and I am becoming better at managing those projects – and that most of these will eventually be required to integrate with AI.

  • Day 53 – Laravel

    Despite:

    • SAAS being completely overpopulated with products
    • Agents being likely to take over most software
    • Automation tools being likely to remove the need for a lot of tool-chaining dev work

    Despite all this, I still want to build my own platform. Who knows where it will lead to.

    I had started to do this already but in the process:

    • Laravel upgraded to 12
    • Laravel Jetstream was superseded by the new starter kits
    • Laravel Spark wasn’t suitable for Stripe integration since I wanted to use Stripe tax

    Before I go much further with the dev, I’m going to strip back and start from the React starter kit.

    I’m comfortable with either Vue or React, but I’m going with React, mainly because I’m going to integrate some stuff from Flowbite Pro.

    How To Install Laravel 12 Starter Kits with Laravel Sail

    I know there’s Laravel Herd these days, but I’ve got used to Sail and I like using Docker. I don’t see a reason to change. But I needed a manual way of installing Laravel Sail straight off, because my local Composer was all messed up.

    Install.sh

    #!/bin/bash
    
    # Check if a project name was provided
    if [ -z "$1" ]; then
      echo "Usage: $0 <project-name>"
      exit 1
    fi
    
    PROJECT_NAME="$1"
    
    docker run -it --rm \
        --user "$(id -u):$(id -g)" \
        -v "$(pwd):/app" \
        -w /app \
        -e COMPOSER_HOME=/tmp/composer \
        laravelsail/php84-composer:latest \
        bash -c "composer global require laravel/installer && /tmp/composer/vendor/bin/laravel new \"${PROJECT_NAME}\""

    You can then call this with:

    sh install.sh desired_project_name

    This script installs Laravel from a Docker container with Composer in it. Then cd into your new directory and run a second script to install Sail. I like how this step lets you select the DB type, because previously I had to mess around with Docker to get it working.

    Install2.sh

    #!/bin/bash

    # Run sail:install inside the same Composer container as before
    docker run -it --rm \
        --user "$(id -u):$(id -g)" \
        -v "$(pwd):/app" \
        -w /app \
        -e COMPOSER_HOME=/tmp/composer \
        laravelsail/php84-composer:latest \
        php artisan sail:install

    Big thanks to the guy on Reddit.

    You can then use:

    • sail up -d
    • sail artisan migrate
    • sail npm install
    • sail npm run dev

    Has Laravel Gone Backwards With These?

    They’ve pissed off a lot of people in the community who were relying on Jetstream. You can still use it, but it looks like the Laravel team are going to force it out over the long term.

    There’s:

    • No teams setup, whereas Jetstream had it
    • Jetstream had 2FA
    • Jetstream had a better profile settings page

    The missing teams support in particular is a step backwards from Jetstream. However, it’s not a huge problem.

    Good Opportunity For Data Export Feature

    The main premise of the platform is keeping your data intact and organised. Because it’s a SAAS system at its heart, technically the data sits on our servers, but we would offer a solution where you can have your own server and full control over it.

    I’ve never liked apps that don’t let you export your data nicely. I’ve been using this Laravel 11 platform for a few things and I’ll want to bring the data over… so a new export/import feature is imminent.

  • Day 52 – ‘Flowgramming’ as the new threat and opportunity to programmers

    So, the new era of the internet is going to be connecting the dots. In this paradigm, the dots are the ecosystems that already exist and have many users.

    These ecosystems have had millions of dollars poured into them. They have somewhat of a technical moat now but their user ecosystem widens that moat most of all.

    For instance, I have been meaning to get started on n8n for a few weeks now. I’m glad I sat down and made time for it today … this is a brand new type of software, available to people without having to pay for the upfront development of handling the communication between the dots.

    As a starting point I set up a workflow that grabbed my favourite Arsenal FC blog, asked OpenAI to summarise it, and then messaged the summaries to me on Slack.

    After some fiddling around I got some messages in Slack!

    I can’t overemphasise how great this is. This is a huge timesaver, but it’s yet again another threat to programmers.

  • Day 51 – Vibe Coding

    Vibe coding is really taking over. Simply put, non-technical people with ideas can now use AI to make very high quality prototypes.

    Those prototypes can do most of the things a company needs. However, in almost all cases, the generated code behind them is somewhat bad (from a software engineering standpoint) … and if you keep going down that route without regular refactoring and tidying up, you will eventually accumulate technical debt.

    That technical debt may only arrive three years down the road, though, at which point you will have to hire an experienced programmer. But if you are still going in three years, you are likely to have money by then and can afford the programmer. So that’s the payoff!