I need a formula that returns the cost of running the AC per day and then I plan to use this formula to project into the future to find the cheapest way to run the AC for 5 months.
Parameters that must be included:
Duration of the cooling run (e.g., ran the AC for 5 hours)
Rate at which the temperature rises again while the AC is not running (i.e., in standby mode, with the hot weather outside driving the indoor temperature back up)
Temperature outside
Temperature indoors
Cost per minute of run time
Any extra startup cost incurred each time the AC is switched on (as opposed to running it continuously)
Target indoor temperature (71 degrees Fahrenheit)
This is tricky because the cost of cooling the house might be lower if you keep the AC correcting small variations in temperature (i.e., when it rises to 72 degrees Fahrenheit, run the AC to bring it back down to 71 degrees Fahrenheit), as opposed to turning it off for 8 hours and then running it for 4 hours to bring it down from 80 degrees Fahrenheit to 71 degrees Fahrenheit.
I can provide more info as needed, but I think it might be fine if this formula is kept in terms of arbitrary variables.
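To make the shape of what I am after concrete, here is a rough sketch in Python of the kind of formula I mean. All the names (cost_per_minute, warm_rate_f_per_min, etc.) are just placeholders for the parameters listed above, and the constant warm-up and cool-down rates are an assumption on my part, not something I have measured.

```python
def daily_cost(run_minutes, num_startups, cost_per_minute, startup_cost=0.0):
    """Cost of one day's AC usage: run time plus any per-startup penalty."""
    return run_minutes * cost_per_minute + num_startups * startup_cost


def minutes_to_cool(temp_start_f, temp_target_f, cool_rate_f_per_min):
    """Run time needed to pull the indoor temperature down to the target,
    assuming a constant cooling rate in degrees Fahrenheit per minute."""
    return max(temp_start_f - temp_target_f, 0.0) / cool_rate_f_per_min


def temp_after_standby(temp_start_f, standby_minutes, warm_rate_f_per_min, temp_outside_f):
    """Indoor temperature after the AC has been off for a while, assuming a
    constant warm-up rate and capping at the outdoor temperature."""
    return min(temp_start_f + warm_rate_f_per_min * standby_minutes, temp_outside_f)
```

Projecting over 5 months would then just mean summing daily_cost over whatever on/off schedule I am considering.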
You need to collect data. Presumably the power consumption is (about) constant when the AC is running, so you can measure that and look at your electric bill for the cost per kWh. I would be surprised if there is any significant startup cost. You can do your measurement when the house is hot so the AC runs continuously.

A simple model is that there is a certain heat leak in watts into the house which is proportional to the temperature difference between inside and outside, and that the AC can pump a certain number of watts out when it is running. Sun or wind may increase the heat intake of the house. The amount the AC can pump may reduce as the temperature difference increases.

For various temperature differentials, measure the rate of temperature rise inside with the AC off and the rate of temperature fall with the AC on. Plot the data and look at it. Then you can hope to build a model. What you may well find is that a certain temperature difference leads to a certain percentage of time the AC is on, and the cost is just that fraction times the cost of full-time running. You are interested in whether the percentage of on time is linear in the temperature difference or not.
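As a minimal sketch of that model, assuming the heat leak really is linear in the temperature difference and the AC removes heat at a fixed rate while running (both things your measurements should confirm or refute), the steady-state cost of holding a set point works out like this:

```python
def holding_cost_per_day(temp_out_f, temp_set_f, leak_w_per_degf,
                         ac_cooling_w, ac_electrical_kw, cost_per_kwh):
    """Cost of holding a fixed indoor temperature for 24 hours under a linear
    heat-leak model: the duty cycle is leak-in divided by pump-out capacity,
    and the cost is that fraction of the cost of running full time."""
    leak_in_w = leak_w_per_degf * max(temp_out_f - temp_set_f, 0.0)
    duty_cycle = min(leak_in_w / ac_cooling_w, 1.0)   # fraction of time the AC is on
    return duty_cycle * ac_electrical_kw * 24.0 * cost_per_kwh
```

Here leak_w_per_degf and ac_cooling_w are exactly the two numbers your AC-off and AC-on measurements would give you; a 5-month projection is then this per-day cost summed over forecast outdoor temperatures.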
I strongly suspect that a standard thermostat is optimal. Let it get as hot as is acceptable inside, then run the AC until it cools down a bit, turn the AC off, and let it warm up again. Running the AC any earlier increases the temperature difference between outside and inside, which increases the heat leaking in, which means you have to run the AC more to reject that extra heat. The thing that will save money is raising the acceptable indoor temperature. You talk about letting it get to 80 and then cooling to 71, but if 80 is acceptable, why cool back down from it at all?
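You can see this directly in the linear model: over a full day that starts and ends at the same indoor temperature, the total heat the AC must reject equals the total heat that leaked in, which depends only on the time-average indoor-outdoor difference. A toy comparison (all numbers made up, the warm-up treated as a coarse hourly ramp, and the AC's efficiency folded into an assumed coefficient of performance) of holding 71 all day versus letting the house drift toward 80 for part of the day:

```python
def daily_energy_kwh(indoor_temps_f, outdoor_temp_f, leak_w_per_degf, cop):
    """Electrical energy (kWh) needed to reject one day's heat leak, given an
    hourly indoor-temperature schedule, a linear leak, and an assumed AC
    coefficient of performance (cop)."""
    heat_in_wh = sum(leak_w_per_degf * max(outdoor_temp_f - t, 0.0)
                     for t in indoor_temps_f)        # one entry per hour
    return heat_in_wh / cop / 1000.0

hold_71 = [71.0] * 24                                 # thermostat pinned at 71 F
drift   = [71.0 + 9.0 * h / 8 for h in range(8)] + [80.0] * 4 + [71.0] * 12
outdoor, leak, cop = 95.0, 60.0, 3.0                  # placeholder values

print(daily_energy_kwh(hold_71, outdoor, leak, cop))  # higher: larger average difference
print(daily_energy_kwh(drift, outdoor, leak, cop))    # lower: smaller average difference
```

The drifting schedule comes out cheaper precisely because the average temperature difference is smaller; the only reason to cool back down at all is comfort, which is why raising the acceptable indoor temperature is what actually saves money.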