thermal: consider emul_temperature while computing trend
In case emulated temperature is in use, using the trend provided by the
driver layer can lead to a bogus situation: a debugging user would set a
temperature value, but the trend would still come from the driver's
computation. To avoid this, this patch changes get_tz_trend() to consider
the emulated temperature whenever it is in use.

Cc: Zhang Rui <rui.zhang@intel.com>
Cc: Amit Daniel Kachhap <amit.daniel@samsung.com>
Cc: Durgadoss R <durgadoss.r@intel.com>
Cc: linux-pm@vger.kernel.org
Cc: linux-kernel@vger.kernel.org
Signed-off-by: Eduardo Valentin <eduardo.valentin@ti.com>
Signed-off-by: Zhang Rui <rui.zhang@intel.com>
commit 0c872507d8
parent 24c7a38172
@@ -155,7 +155,8 @@ int get_tz_trend(struct thermal_zone_device *tz, int trip)
 {
 	enum thermal_trend trend;
 
-	if (!tz->ops->get_trend || tz->ops->get_trend(tz, trip, &trend)) {
+	if (tz->emul_temperature || !tz->ops->get_trend ||
+	    tz->ops->get_trend(tz, trip, &trend)) {
 		if (tz->temperature > tz->last_temperature)
 			trend = THERMAL_TREND_RAISING;
 		else if (tz->temperature < tz->last_temperature)
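The fallback path exercised by this patch can be sketched as a small
standalone C program. This is a minimal sketch, not kernel code: the
struct fake_zone type and the fallback_trend() helper are hypothetical
stand-ins for the fields of struct thermal_zone_device that the fallback
branch reads, and only the trend-from-history comparison is modeled.

```c
/* Simplified stand-ins for the kernel's enum thermal_trend values. */
enum thermal_trend {
	THERMAL_TREND_STABLE,
	THERMAL_TREND_RAISING,
	THERMAL_TREND_DROPPING,
};

/* Hypothetical minimal zone: only the fields the fallback path reads. */
struct fake_zone {
	int temperature;	/* current reading, millicelsius */
	int last_temperature;	/* previous reading, millicelsius */
};

/*
 * Mirror of the fallback branch in the hunk above: when emulation is in
 * use (or no .get_trend callback exists), the trend is derived by
 * comparing the current and previous zone temperatures rather than
 * asking the driver.
 */
enum thermal_trend fallback_trend(const struct fake_zone *tz)
{
	if (tz->temperature > tz->last_temperature)
		return THERMAL_TREND_RAISING;
	else if (tz->temperature < tz->last_temperature)
		return THERMAL_TREND_DROPPING;
	return THERMAL_TREND_STABLE;
}
```

With the patch applied, a temperature written via the emulation debug
interface feeds this history comparison, so the reported trend matches
the emulated value instead of the driver's hardware-based computation.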