
Foglight 5.9.5 - Data Model Guide

Contents: The Foglight data model · Data modeling tutorials · Appendix: Groovy scripts · Appendix: Internal database schema

The Foglight data model

This section provides an introduction to models and discusses the Foglight data model.

What are models?

In general, models are abstractions that capture the essence of the objects they represent. A good model looks and behaves like the real thing, at least in certain ways; a model that was perfect in every respect would be indistinguishable from the real thing. We could then pose questions to the model and obtain the same, or almost the same, answers as we would get from the real object. If the object under consideration changes, the model must change accordingly in order to faithfully represent it.

The data model used in the Management Server is constructed to do just that. The data sent to the Management Server changes with time, not only because the measurements on properties change, but because the objects themselves may come and go. So, a data model for use with the Management Server must be designed to accommodate the creation of objects, by placing them in a well-designed model hierarchy. Objects have relationships among themselves, and a good model accounts for those relationships.

To the Management Server, models are collections of related data objects. The totality of data objects in existence at any one time is referred to as the “data model”.

You can examine the dynamic data objects in the data browser (Dashboards > Configuration > Data > Management Server > All Data).

Who needs to know about models?

Knowledge of the data model is beneficial if you are performing tasks such as the following.

If the data is not already available from existing agents, create an agent to collect the data and install it on the target systems. One way is to create a formatted script that writes to STDOUT, upload the script, build a script agent, and then deploy and activate the agent. Consult Quest’s Professional Services organization for assistance.
NOTE: You can use any executable on the client system that writes to STDOUT. All you need is a script that launches the application.
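As a rough illustration, a script agent emits its collected values to STDOUT in a plain-text, table-oriented format. The table name, field names, and units below are hypothetical; check the Foglight documentation on building script agents for the exact keywords and format your server version expects:

```
TABLE DiskActivity
START_SAMPLE_PERIOD
reads.count = 120
writes.count = 45
busy.percent = 17.5
END_SAMPLE_PERIOD
END_TABLE
```

Any script or executable that prints output in the expected format can serve as the collector.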

Modeling process overview

In this guide, modeling refers to the process of creating new in-memory models. This process can include the creation of topology types, instances of those types, dashboards, and the transformations that populate the instances with collected data.

The modeling process is illustrated in the following diagram.

You develop models to organize and convey the relationships that exist among the pieces of monitoring data, so that they can be presented visually in dashboards.

The first step in the modeling process is to define the static types that are used to represent the data. These form the core data types of the data model. Once these types are defined, it is possible to represent them in a schema, and these types can be seen in the Schema Browser dashboard.

You define types by creating a topology type XML file. You can deploy the XML file to the Management Server using the administration UI. The Management Server handles versioning of the types.
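The sketch below shows the general shape of a topology type XML file. The type name and property names are hypothetical, and the exact schema and attribute set are defined by the Management Server, so verify them against the type definitions shipped with an existing cartridge:

```xml
<types>
  <!-- A hypothetical type extending the built-in TopologyObject -->
  <type name='ExampleAppServer' extends='TopologyObject'>
    <!-- Identity properties distinguish one instance from another -->
    <property name='serverName' type='String' is-identity='true'/>
    <!-- A metric property that dashboards can chart once data arrives -->
    <property name='requestRate' type='Metric' is-containment='true'/>
  </type>
</types>
```

Deploying a revised copy of the file creates a new version of the types rather than overwriting them, since the Management Server handles versioning.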

The second step in the modeling process is to define instances of those types. Currently, you can only create instances of types using Groovy scripts.
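For example, a Groovy script run inside the Management Server’s scripting environment (such as the script console) can create an instance through the topology service. The type name and property values below are hypothetical, and the calls shown should be checked against the Groovy scripts appendix; this is a sketch, not runnable outside the server:

```groovy
// Obtain the topology service from the server context
def topSvc = server.TopologyService

// Look up the previously deployed type and build an instance of it
def type = topSvc.getType("ExampleAppServer")
def obj = topSvc.createAnonymousDataObject(type)
obj.set("serverName", "app-01.example.com")

// Merge the new object into the topology model
topSvc.mergeData([obj])
```

Scripts like this are useful for seeding test instances before any agent data is flowing.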

The third step in the modeling process is to create dashboards using the Web Component Framework (WCF). You can create dashboards with just the types deployed, as long as the type definitions include metric definitions. However, it is difficult to test the dashboards without instances.

Once you have built dashboards and they are operating on test data, it is time to start using agents and enable transformations (the fourth step in the modeling process). The transformations translate collected data into the topology object instances specified in the second step.

Ideally, the final-form agent is feature complete and collecting data when you reach this step.
