Bias and interpretation of the results of DecisionTreeRegressor and RandomForestRegressor

I am comparing the performance of different regression methods on my dataset (9 predictors, roughly 9000 points), and I am observing the following:

  1. Linear-regression-type methods (e.g., Ridge, ElasticNet) give an overall high RMSE and a low R^2 score; however, the standardized coefficients take reasonable values across all of my predictors.
  2. DecisionTreeRegressor and RandomForestRegressor yield a lower RMSE and a higher R^2 score; however, the feature importances are heavily skewed towards a single predictor (e.g., its importance is 0.9).
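In case it helps, here is a minimal sketch of the kind of comparison I am running (using scikit-learn; the data below are synthetic, generated with `make_regression` to stand in for my actual dataset, and the hyperparameters are just defaults):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for my data: 9 predictors, ~9000 points.
X, y = make_regression(n_samples=9000, n_features=9, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Standardize so the Ridge coefficients are comparable across predictors.
scaler = StandardScaler().fit(X_train)
ridge = Ridge().fit(scaler.transform(X_train), y_train)
ridge_r2 = r2_score(y_test, ridge.predict(scaler.transform(X_test)))

forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)
forest_r2 = r2_score(y_test, forest.predict(X_test))

print("Ridge R^2:", ridge_r2)
print("Forest R^2:", forest_r2)
print("Standardized coefficients:", ridge.coef_)
print("Feature importances:", forest.feature_importances_)
```

With my real data, it is the `forest.feature_importances_` vector that comes out dominated by one predictor, while `ridge.coef_` looks balanced.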

Can you help me with an intuitive explanation of why this happens? Also, how should I interpret the results of DecisionTreeRegressor and RandomForestRegressor? Is there any concept of coefficients in this case? (I have seen a Stack Overflow post whose answer suggests looking at feature importances, but in my case that seems insufficient.)

Thanks in advance,