I have my own theory about this based on scraps of info I've picked up over the years.
Planes have an optimal stage length, which will vary with payload. Why?
Say one loaded an aircraft to fly 1000nm. Take-off is inherently expensive and costs, say, 10t, while cruise costs 1t per 100nm. If you flew 300nm you'd have burnt 13t, so about 23nm to the ton. If you flew the full 1000nm you'd have burnt 20t, so 50nm to the ton. So this factor suggests the further you travel, the more you dilute the take-off cost and the better the fuel economy.
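The arithmetic above can be checked with a quick sketch (the 10t take-off cost and 1t/100nm cruise burn are just the made-up numbers from this example):

```python
# Take-off-dilution effect: a fixed take-off cost spread over more miles.
TAKEOFF_BURN_T = 10.0               # assumed fixed take-off cost, tonnes
CRUISE_BURN_T_PER_NM = 1.0 / 100.0  # assumed cruise burn, tonnes per nm

def nm_per_tonne(distance_nm):
    """Fuel economy (nm flown per tonne burnt) for a given stage length."""
    total_burn = TAKEOFF_BURN_T + CRUISE_BURN_T_PER_NM * distance_nm
    return distance_nm / total_burn

print(nm_per_tonne(300))   # 300 / 13 ≈ 23.1 nm per tonne
print(nm_per_tonne(1000))  # 1000 / 20 = 50.0 nm per tonne
```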
However there is another factor to consider: carrying fuel to burn fuel. If you only fly 300nm you only need to load 13t of fuel, so the aircraft is 7t lighter than in the 1000nm case. A lighter aircraft burns less fuel in cruise, so ignoring take-off costs, fuel economy improves. This factor says the shorter the distance, the better the fuel economy.
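Putting the two effects together can be sketched with a toy model. All the numbers here are invented: I'm assuming cruise burn per mile is proportional to total weight (empty weight plus fuel still on board), which is what makes fuel load grow faster than linearly with distance.

```python
import math

EMPTY_WEIGHT_T = 150.0   # assumed zero-fuel weight, tonnes
TAKEOFF_BURN_T = 10.0    # assumed fixed take-off cost, tonnes
BURN_COEFF = 0.00004     # assumed burn per nm per tonne of total weight

def fuel_needed_t(distance_nm):
    """Fuel to load for a stage, solving dF/dx = -c*(W_empty + F), F(D) = 0.
    Grows faster than linearly with distance: carrying fuel to burn fuel."""
    return EMPTY_WEIGHT_T * (math.exp(BURN_COEFF * distance_nm) - 1.0)

def burn_per_nm(distance_nm):
    """Total burn per nm: diluted take-off cost plus weight-dependent cruise."""
    return (TAKEOFF_BURN_T + fuel_needed_t(distance_nm)) / distance_nm

# Scan stage lengths; the two opposing effects give a U-shaped curve
# with an interior sweet spot.
best = min(range(100, 10001, 100), key=burn_per_nm)
print(f"best stage length ~ {best}nm, burn ~ {burn_per_nm(best):.4f} t/nm")
```

With these made-up constants the minimum falls somewhere mid-scan rather than at either end, which is the superposition described below: short stages waste the take-off cost, long stages waste fuel carrying fuel.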
Now if you superimpose these two factors there will be a stage length where the burn per mile is at a minimum. Move away from this length in either direction and the burn per mile rises faster than linearly (in my example below, quadratically). This convex behaviour means that when flying a two-stage flight it's best to have the tech stop at the mid point, even if it means neither sector hits the optimal stage length. That way the two deviations from the optimal length are equal, and under a convex penalty two equal deviations cost less than one small and one large. Eg.
Disparity = 4 = 2+2 or 1+3
2^2 + 2^2 = 8
1^2 + 3^2 = 10
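The figures above can be checked with a tiny sketch (the quadratic penalty is just an assumed shape for the convex cost of missing the sweet spot):

```python
# Under a convex (here quadratic) penalty for missing the optimal stage
# length, splitting a fixed total deviation equally beats an uneven split.
def penalty(dev_a, dev_b):
    """Toy cost of two sectors missing the sweet spot by dev_a and dev_b."""
    return dev_a**2 + dev_b**2

print(penalty(2, 2))  # 8
print(penalty(1, 3))  # 10
```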
Now the sweet spot will vary with payload, so with the 777 presumably flying near empty, it's a vastly different calculation from flying with a full payload. To be honest it seems like they were more concerned with getting the job done hassle-free than with costs.
As I've worked all of this out myself, some or all of it may be incorrect.