The simple way is a resistor, sized appropriately for your current draw. That can waste a fair amount of power if you're pulling significant current through your converter. If you're drawing 1A at the 12V output (12W), that would be about 250 mA at the 60V input, allowing for some converter losses. A resistor to drop that from 100V would dissipate 10W. I guess that's not terrible, but now you're drawing 25W from your pack to deliver 12W, so just under 50% overall efficiency.
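To make those numbers concrete, here's a quick back-of-the-envelope calculation (Python, though any calculator will do). The 100V pack, 60V converter input, and 12V/1A load come from the figures above; the 80% converter efficiency is just an assumed round number consistent with "some converter losses".

```python
# Back-of-the-envelope numbers for the resistor-only drop.
# 100 V pack, 60 V converter input, 12 V / 1 A load are from the text above;
# the 80% converter efficiency is an assumed round figure.

V_pack = 100.0          # battery pack voltage (V)
V_in = 60.0             # target converter input voltage (V)
P_out = 12.0 * 1.0      # 12 V at 1 A = 12 W out
eta = 0.80              # assumed converter efficiency

P_in = P_out / eta                    # ~15 W into the converter
I_in = P_in / V_in                    # ~0.25 A drawn at 60 V
R = (V_pack - V_in) / I_in            # ~160 ohm dropping resistor
P_R = (V_pack - V_in) * I_in          # ~10 W burned in the resistor
eta_total = P_out / (V_pack * I_in)   # ~48% overall

print(f"R = {R:.0f} ohm, P_R = {P_R:.1f} W, overall efficiency = {eta_total:.0%}")
```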
You could also rig something up using a 3-terminal voltage regulator like an LM317, or a zener diode/resistor combo. Both of those dissipate about as much power as the resistor-only option, but they hold the voltage fed to the converter a bit steadier as the load current varies. Which approach is better depends on whether you have a relatively constant load on the 12V side or a highly variable one.
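If you go the zener/resistor route, the sizing looks roughly like the sketch below. This assumes the common shunt arrangement (series resistor from the pack, a ~60V zener or stack of smaller zeners across the converter input) and a ~5 mA minimum zener current at full load; those details are assumptions on my part, and the other numbers carry over from the example above.

```python
# Rough sizing for a zener + series-resistor clamp at the converter input.
# Assumes the usual shunt arrangement: resistor from the pack, ~60 V zener
# (or a stack) across the converter input. The 5 mA minimum zener current
# at full load is an assumption; the rest follows the example above.

V_pack = 100.0          # pack voltage (V)
V_z = 60.0              # clamp voltage (V)
I_load = 0.25           # max converter input current (A)
I_z_min = 0.005         # keep ~5 mA in the zener at full load (assumed)

R = (V_pack - V_z) / (I_load + I_z_min)   # ~157 ohm series resistor
P_R = (V_pack - V_z) ** 2 / R             # ~10 W in the resistor, load or not

print(f"R ≈ {R:.0f} ohm, rate it for at least {P_R:.0f} W")
```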
Also, if applicable, make sure you consider the scenario where there's no load on the 12V output. If you only use a resistor to drop the voltage and no current is flowing, the input to the converter will rise to near the full battery voltage and you might pop something. A zener/resistor combo would fix this, but it drains some power even when you're not using any 12V devices, so a cutoff switch might be in order.
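To put a rough number on that standby drain, using the same assumed values as the zener sketch above:

```python
# Standby drain of the zener/resistor clamp with nothing on the 12 V output,
# using the same assumed values as above (100 V pack, 60 V zener, ~157 ohm
# series resistor). With no load, all the resistor current goes through the zener.

V_pack, V_z, R = 100.0, 60.0, 157.0
I = (V_pack - V_z) / R              # ~0.25 A still flows
P_pack = V_pack * I                 # ~25 W drawn from the pack at idle
P_z = V_z * I                       # ~15 W of that in the zener itself

print(f"Idle drain ≈ {P_pack:.0f} W from the pack ({P_z:.0f} W in the zener)")
```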
Hope that helps. I can give more specific suggestions if you share some more info about your application and priorities.