Two hours to stabilize at 1975 degF seems like a long time.
The last couple of HT ovens I built were 230V, 3000W. The UK uses 13A-fused domestic plugs and a 230V nominal mains voltage, so the elements were sized for around 12.5A at 240V: 3 kW. The rationale was that there is some tolerance on mains voltage, and popping a fuse in the middle of an HT session would be worse than a slightly slower ramp up to temperature. The chambers were 28" long, 7" wide and 6" tall.
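If anyone wants to play with the sizing numbers, the whole calculation is just Ohm's law (a minimal sketch in Python; the element resistance simply falls out of the 12.5A-at-240V choice above, it isn't a measurement of my actual elements):

```python
# Element sizing for a UK 13A plug: choose the resistance so the
# current stays under the fuse rating at the high end of the mains
# voltage tolerance.
fuse_rating_a = 13.0
design_v = 240.0   # high end of UK mains tolerance
nominal_v = 230.0  # nominal UK mains voltage

design_i = 12.5                   # amps, leaving headroom under the 13A fuse
resistance = design_v / design_i  # ohms: 240 / 12.5 = 19.2

power_at_240 = design_v ** 2 / resistance   # 3000 W
power_at_230 = nominal_v ** 2 / resistance  # ~2755 W
current_at_240 = design_v / resistance      # 12.5 A

print(f"Element resistance: {resistance:.1f} ohm")
print(f"Power at 240V: {power_at_240:.0f} W ({current_at_240:.1f} A)")
print(f"Power at 230V: {power_at_230:.0f} W")
print(f"Headroom under the fuse: {fuse_rating_a - current_at_240:.1f} A")
```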
From my initial testing notes: "800 degC was reached in 22 ½ minutes, 1100 degC took 54 ½ minutes, the temperature at an hour was 1125 degC and 1177 degC (2150 degF) took 71 minutes."
Assuming the oven runs at full power throughout is the "safe" way to work out the power cost: it gives you a worst-case upper bound.
If you preheat the oven, it will draw full power during the heat-up process. Then it will cycle the elements on and off to maintain temperature.
The controller I was using could display the percentage output cycle. From memory, for those ovens, maintaining the setpoint with the oven closed took between about 15% "on" time (at 800 degC, 1472 degF) and about 40% (at 1177 degC, 2150 degF): an average of between 450W and 1200W during the hold period.
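The hold-power figures are just the duty cycle multiplied by the full element power, along these lines (a quick sketch using my numbers; your duty cycles will vary with chamber size and insulation):

```python
element_power_w = 3000.0

# "On" fractions read off the controller's output display, from memory
duty_cycles = {
    "800 degC (1472 degF)": 0.15,
    "1177 degC (2150 degF)": 0.40,
}

for setpoint, duty in duty_cycles.items():
    avg_w = element_power_w * duty
    print(f"Hold at {setpoint}: {duty:.0%} duty -> {avg_w:.0f} W average")
```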
Batch-treating 5 or 6 stainless blades would probably use less than 5 kWh all told, even pulling them out one at a time for plate quenching and letting the oven temperature re-stabilize each time. I find it pretty difficult to envisage a scenario in which the cost of the power to run the HT oven approaches the cost of the HT foil used.
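To put rough numbers on that, here's a back-of-envelope check (a sketch only: the soak time, ~35% hold duty and 15-minute-per-blade re-stabilization are illustrative assumptions, not measurements; the ramp time and the 15%/40% duty-cycle endpoints come from the figures above):

```python
# Back-of-envelope energy estimate for a batch stainless HT session.
element_kw = 3.0

# Ramp: ~1100 degC took 54.5 min at full power, so call it 55 min
# to reach a typical stainless austenitizing temperature.
ramp_hours = 55 / 60
ramp_kwh = element_kw * ramp_hours  # ~2.75 kWh

# Hold: assume ~35% duty (between the observed 15% at 800 degC and
# 40% at 1177 degC), a 30-minute batch soak, and 15 minutes of
# re-stabilization per blade for 6 blades. All assumptions.
hold_duty = 0.35
soak_hours = 0.5
restabilize_hours = 6 * 0.25
hold_kwh = element_kw * hold_duty * (soak_hours + restabilize_hours)  # ~2.1 kWh

total_kwh = ramp_kwh + hold_kwh
print(f"Ramp: {ramp_kwh:.2f} kWh")
print(f"Hold: {hold_kwh:.2f} kWh")
print(f"Total: {total_kwh:.2f} kWh")  # ~4.85 kWh, under 5 kWh
```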