I'm running into a strange problem. In my Global.asax, in Application_Start(), I have code that hits my database, grabs all of my application settings from a name/value table, and then dumps them into the Application via Application.Add(name, value).
I have an “application facade” in another project that my service layers, data layers, etc. use to grab whatever settings they need for various bits and pieces.
In my database, I have a couple of entries:
ConfigName | ConfigValue
WebServiceUsername | myUsername
WebServicePassword | myPassword
So in my method I go out, get those values from the database, and put them into the Application:
protected void GetApplicationSettings()
{
    var appConfigAttributes = ApplicationConfigurationService.GetAppConfigNames();

    foreach (var appConfig in appConfigAttributes)
    {
        // Store each name/value pair from the database in Application state
        Application.Add(appConfig.ConfigName, appConfig.ConfigValue);
    }
}
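For context on where those values end up: as far as I understand it, Application.Add() puts them into Application state (HttpApplicationState), keyed by ConfigName. Here's a quick sketch of reading one back out of that state; this isn't code I actually have, and the class and method names are made up:

using System.Web;

// Hypothetical helper, only to show where Application.Add() puts the values.
public static class ApplicationStateSketch
{
    public static string GetWebServiceUsername()
    {
        // Application state is reachable outside a Page via HttpContext.Current
        // (assuming we're inside a request); the cast is needed because the
        // state collection stores plain objects.
        return (string)HttpContext.Current.Application["WebServiceUsername"];
    }
}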
And this is how I pull the value back out later, through the facade:
public static string WebServiceUsername
{
    get { return WebConfigurationManager.AppSettings["WebServiceUsername"]; }
}
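As far as I know, WebConfigurationManager.AppSettings here reads the <appSettings> section of web.config, and the plain ConfigurationManager version reads the same values in a web app. Just as a sketch of the equivalent (not code I actually have; the property name is made up):

using System.Configuration;

public static string WebServiceUsernameViaConfigurationManager
{
    get { return ConfigurationManager.AppSettings["WebServiceUsername"]; }
}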
Here's where things get weird. On my page, I call:
<%= ApplicationFacade.WebServiceUsername %>
...and I get nothing back (the ConfigurationManager can't find it!).
However, if I add this to my web.config...
<appSettings>
    <add key="putz" value="mash"/>
</appSettings>
...and then add a Putz property to my ApplicationFacade and call it on my page (<%= ApplicationFacade.Putz %>), I get back 'mash'.
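The Putz property is the same shape as WebServiceUsername above, roughly:

public static string Putz
{
    get { return WebConfigurationManager.AppSettings["putz"]; }
}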
So the ApplicationFacade itself is clearly working. Is the problem that the values just aren't getting added in Application_Start()?
Except that if I put <%= Application["WebServiceUsername"] %> directly on the page, I get myUsername.
What gives?!
Update: swapping the Application.Add() call for this line gets the facade working:

ConfigurationManager.AppSettings.Set(appConfig.ConfigName, appConfig.ConfigValue);
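In other words, the method ends up looking roughly like this (same shape as before, just targeting the AppSettings collection; as far as I can tell, values set this way live only in memory and are never written back to web.config):

// In Global.asax.cs (System.Configuration referenced for ConfigurationManager)
protected void GetApplicationSettings()
{
    var appConfigAttributes = ApplicationConfigurationService.GetAppConfigNames();

    foreach (var appConfig in appConfigAttributes)
    {
        // Put each pair into the runtime AppSettings collection, which as far
        // as I can tell is the same data WebConfigurationManager.AppSettings reads.
        ConfigurationManager.AppSettings.Set(appConfig.ConfigName, appConfig.ConfigValue);
    }
}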