A simpler way to do the calculation is to look up the average kWh per square meter per day for your location in the online database I linked to (3.91 in my example) and multiply it by the nominal panel power (for example 30W). This gives the number of watt-hours per day that the panel will provide, and you don't need to worry about panel area or panel efficiency. It works because the nominal panel power (30W) is measured under standard test conditions, which are 1000W/m2 irradiance at a panel temperature of +25C; the insolation figure is therefore effectively the number of full-sun hours per day.
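As a quick sketch of that arithmetic (the 3.91 figure is from my example; substitute the value for your own location and panel):

```python
# Daily energy estimate from insolation and nominal panel power.
# The insolation figure (kWh/m^2/day) divided by the STC irradiance
# (1 kW/m^2) gives the equivalent full-sun hours per day.

insolation = 3.91        # kWh per m^2 per day, from the online database
stc_irradiance = 1.0     # kW per m^2, standard test conditions
panel_power_w = 30.0     # nominal panel power in watts

sun_hours = insolation / stc_irradiance   # equivalent full-sun hours/day
daily_wh = panel_power_w * sun_hours      # watt-hours per day

print(f"{daily_wh:.1f} Wh per day")       # 117.3 Wh per day
```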
Note that there are variations in real life that can greatly affect the accuracy of the calculation. For example, the number of watt-hours is affected by panel temperature, usually falling about 0.45% per degree of temperature rise above +25C. So if the panel is located in a hot climate you will get less solar energy from the panel than expected; in a cold climate you will get more.
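A sketch of that temperature correction, using the 0.45%/degree coefficient above (the example cell temperatures are assumptions, not measurements; a panel in full sun runs well above ambient):

```python
def temperature_factor(cell_temp_c, coeff_per_c=0.0045, ref_temp_c=25.0):
    """Fraction of rated output remaining after temperature derating.
    Above 25 C the output drops ~0.45%/degree; below 25 C it rises."""
    return 1.0 - coeff_per_c * (cell_temp_c - ref_temp_c)

# Assumed example temperatures: 55 C for a hot climate, 5 C for a cold one.
print(temperature_factor(55.0))   # 0.865 -> about 13.5% below rated
print(temperature_factor(5.0))    # 1.09  -> about 9% above rated
```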
Power losses in the cables and in the charge controller, the tracking efficiency of the charge controller, and the charge efficiency of the battery all reduce the number of watt-hours available for powering the load.
A good charge controller uses Maximum Power Point Tracking (MPPT) to extract as much of the available panel power as possible. A sophisticated charger such as the LT8490 has a tracking efficiency of around 99% and a power-stage efficiency of 98%. Cable losses depend on wire size and length. Battery charge efficiency is about 90% for a sealed lead-acid battery. When you add up these losses you end up with the 85% figure that I used in my calculation. I did not take panel temperature into account.
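The 85% figure can be reproduced roughly like this (the 3% cable loss is my assumption to make the budget concrete; the other numbers are from the text):

```python
# System loss budget: multiply the individual efficiencies together.
mppt_tracking = 0.99   # LT8490 tracking efficiency
power_stage   = 0.98   # LT8490 power-stage efficiency
cable         = 0.97   # assumed ~3% cable loss (depends on gauge and length)
battery       = 0.90   # charge efficiency of a sealed lead-acid battery

overall = mppt_tracking * power_stage * cable * battery
print(f"overall efficiency: {overall:.2f}")   # overall efficiency: 0.85
```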
If you do not use a charger with maximum power point tracking you will get much less power from the same panel. The most common type of charge controller simply connects the panel to the battery, disconnects when the battery voltage rises too high, and then cycles on and off. This wastes a lot of the available power, because the optimum panel voltage is almost never the same as the battery voltage; look at the power-voltage curve of a solar panel to see what I mean. Panel voltage also varies with temperature, which gives this cheap type of charger a really bad tracking efficiency.

You may not worry about that if the solar panel is only used occasionally to charge a boat battery, but for a system that has to run continuously year round you need the more sophisticated charger type (MPPT). Another advantage of a charger built with, for example, the LT8490 controller is that the panel voltage does not have to match the battery voltage: you can use a 36-cell ("12V") panel, or a 48-, 60- or 72-cell panel, to charge the same 12V battery. For northern locations you may want to consider a 200W panel or larger, and then you get more watts per dollar with a 60-cell solar panel of the type used in large installations.
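To illustrate the power-voltage curve argument, here is a toy model of a 60-cell panel's I-V curve. The exponential-knee model and all the panel numbers are my assumptions, not data for any real panel; it compares the power you get when the panel is clamped to a charging 12V battery (~13V) with the power at the maximum power point:

```python
import math

def panel_current(v, isc=8.9, voc=37.5, a=1.6):
    """Toy I-V curve: roughly constant current up to a knee near Voc.
    isc/voc are typical 60-cell values; 'a' is an arbitrary shape factor."""
    return max(0.0, isc * (1.0 - math.exp((v - voc) / a)))

def panel_power(v):
    return v * panel_current(v)

# Power with the panel clamped directly to a 12V battery (~13V charging):
p_battery = panel_power(13.0)

# Power at the maximum power point, found by a simple voltage scan:
p_mpp = max(panel_power(v / 100.0) for v in range(0, 3750))

print(f"at 13 V: {p_battery:.0f} W, at the MPP: {p_mpp:.0f} W")
```

In this sketch the direct connection throws away well over half of the available power, which is the point of the power-voltage curve argument.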
I think it makes sense to use a 12V battery for a system that powers the Edison from a solar panel. The on-board Li-Ion charger is not made for handling the maximum power from the panel; if the panel delivers 30W or more you need a much bigger battery and a bigger charger.
Note that a battery must be used with a solar panel: to store energy, of course, but also to absorb the large voltage variations that a solar panel produces. If you don't draw any current from the panel, the voltage will rise to the open-circuit voltage, which can be high; a 36-cell panel can deliver up to 27V if it is cold.
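A small sketch of why the open-circuit voltage rises in the cold (the 22.5V STC figure and the ~0.35%/degree Voc temperature coefficient are typical assumed values for a 36-cell panel, not measurements):

```python
def open_circuit_voltage(temp_c, voc_stc=22.5, tc_per_c=0.0035):
    """Open-circuit voltage of a 36-cell panel vs cell temperature.
    Voc rises as the temperature falls below the 25 C reference."""
    return voc_stc * (1.0 + tc_per_c * (25.0 - temp_c))

print(f"{open_circuit_voltage(-25.0):.1f} V")  # 26.4 V on a very cold day
```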
It is also important to know that you cannot just attach a switching regulator to a solar panel and expect it to work. The solar panel is a current source, not a voltage source, and if you simply connect a switching regulator (of the type that plugs into a cigarette-lighter socket) to drop the voltage to 5V, the panel voltage can collapse and the system will start and stop continuously. The switcher must be specifically designed to operate from a solar panel.
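The collapse mechanism can be illustrated with the same current-source idea: a switching regulator behaves as a constant-power load, so the input current it demands grows as the input voltage falls. A toy simulation (all the numbers here are assumptions; `isc`/`voc` are in the range of a small 36-cell panel):

```python
import math

def panel_current(v, isc=1.8, voc=22.0, a=1.0):
    """Toy panel I-V curve: a current source with a knee near Voc."""
    return max(0.0, isc * (1.0 - math.exp((v - voc) / a)))

def settles(load_w, v=18.0, steps=200):
    """Crude settling check: a constant-power load demands I = P/V.
    If the panel cannot supply that current, the voltage keeps falling."""
    for _ in range(steps):
        demand = load_w / v
        supply = panel_current(v)
        v += 0.05 * (supply - demand) * v   # voltage moves toward balance
        if v < 1.0:
            return False                    # panel voltage collapsed
    return True

print(settles(10.0))   # True: a modest load finds a stable operating point
print(settles(60.0))   # False: load above what the panel can deliver
```

A switcher designed for solar input avoids this by limiting its input current, or by regulating its input voltage near the panel's maximum power point instead of regulating only the output.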