February 2nd, 2015 06:00
SRM does not discover Hitachi AMS2100
I've installed SRM in our lab, we have a NetApp and HDS AMS2100 array.
The NetApp array is discovered, but the AMS2100 is not. The Hitachi Device Manager server shows a successful connection under Administration > Discovery Center, but no report shows an HDS array.
Querying metrics using the wizard only lists NetApp as an array type, so SRM doesn't appear to be recording anything for the Hitachi storage.
Looking at the Collector Manager logs, the only errors I can see are below. Anyone have any ideas what is happening?
WARNING -- [2015-02-02 13:48:00 GMT] -- HTTPClient::getErrorInfo(): Server reports problem:
WARNING -- [2015-02-02 13:48:00 GMT] -- AccessorManager::fetchDataFromAccessors(): Could not fetch data from one of the accessor. Backup data will be used instead of the data from the accesssors' PTF file 'usedby.csv'.
com.watch4net.apg.v2.collector.plugins.propertytaggingfilter.accessor.AccessorException: Error querying data
at com.watch4net.apg.v2.collector.plugins.propertytaggingfilter.accessor.SparqlAccessor.writeData(SparqlAccessor.java:157)
at com.watch4net.apg.v2.collector.plugins.propertytaggingfilter.accessor.AccessorManager.fetchDataFromAccessors(AccessorManager.java:219)
at com.watch4net.apg.v2.collector.plugins.propertytaggingfilter.PropertyTagger.isNeedReload(PropertyTagger.java:424)
at com.watch4net.apg.v2.collector.plugins.propertytaggingfilter.PropertyTagger.reload(PropertyTagger.java:487)
at com.watch4net.apg.v2.collector.plugins.propertytaggingfilter.PropertyTagger$TimedRefresh.run(PropertyTagger.java:456)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.openrdf.repository.http.HTTPQueryEvaluationException:
at org.openrdf.repository.http.HTTPTupleQuery.evaluate(HTTPTupleQuery.java:48)
at com.watch4net.apg.v2.collector.plugins.propertytaggingfilter.accessor.SparqlAccessor.writeData(SparqlAccessor.java:132)
... 11 more
Caused by: org.openrdf.repository.RepositoryException:
at org.openrdf.http.client.HTTPClient.getTupleQueryResult(HTTPClient.java:1170)
at org.openrdf.http.client.HTTPClient.sendTupleQuery(HTTPClient.java:447)
at org.openrdf.http.client.HTTPClient.sendTupleQuery(HTTPClient.java:422)
at org.openrdf.repository.http.HTTPTupleQuery.evaluate(HTTPTupleQuery.java:41)
... 12 more
INFO -- [2015-02-02 13:48:00 GMT] -- AccessorManager::fetchDataFromAccessors(): A total of 4 lines were written to PTF file 'actdisc.csv' in 28 ms.
INFO -- [2015-02-02 13:48:00 GMT] -- AccessorManager::fetchDataFromAccessors(): A total of 56 lines were written to PTF file 'sensor-enrichment.csv' in 2 ms.
INFO -- [2015-02-02 13:48:00 GMT] -- AccessorManager::fetchDataFromAccessors(): A total of 6 lines were written to PTF file 'zero-aggregated-metrics.csv' in 1 ms.
INFO -- [2015-02-02 13:48:00 GMT] -- PropertyTagger::reload(): Reloading configuration.
garyan
February 2nd, 2015 06:00
Hi Wayne,
The error above indicates that the metadata used to compute the usedby property (one of the properties that LUN-level metrics are tagged with) is not being built successfully. That alone should NOT stop the metrics from getting into the database, so it seems to me that something else is going on.
Please submit an SR (Service Request) so we can help you triage this.
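For what it's worth, the org.openrdf classes in your stack trace suggest the Collector's SparqlAccessor is failing to query a Sesame/OpenRDF repository over HTTP. As a rough sketch of how you could probe that endpoint yourself, here is a minimal Python check. The hostname, port, repository name, and even the /openrdf-sesame path below are assumptions (the path is the standard Sesame HTTP protocol layout; your SRM install may differ), so substitute the values from your Collector configuration before drawing any conclusions.

```python
# Hedged sketch: smoke-test a Sesame/OpenRDF SPARQL repository endpoint.
# HOST, PORT, and REPOSITORY are placeholders, NOT values from any real
# SRM install -- replace them with your Collector's configured settings.
import urllib.error
import urllib.parse
import urllib.request

HOST = "srm-backend.invalid"   # placeholder hostname (.invalid never resolves)
PORT = 8080                    # placeholder port
REPOSITORY = "topology"        # placeholder repository name

# A trivial query: return at most one triple, just to prove connectivity.
query = "SELECT ?s WHERE { ?s ?p ?o } LIMIT 1"
url = (f"http://{HOST}:{PORT}/openrdf-sesame/repositories/{REPOSITORY}"
       f"?query={urllib.parse.quote(query)}")

req = urllib.request.Request(
    url, headers={"Accept": "application/sparql-results+json"})
try:
    with urllib.request.urlopen(req, timeout=5) as resp:
        status = f"endpoint reachable, HTTP {resp.status}"
except urllib.error.URLError as err:
    # Roughly the same failure mode the Collector logs as
    # HTTPQueryEvaluationException / RepositoryException.
    status = f"endpoint unreachable: {err.reason}"
print(status)
```

If this reports the endpoint as unreachable from the Collector host, the usedby warning is a connectivity problem rather than anything specific to the Hitachi array; if it is reachable, the discovery issue lies elsewhere, which is why an SR is the right next step.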
-Gayatri
wtg990
February 2nd, 2015 07:00
OK, I'll do that.
One thing I've noticed about the HDvM install is that Storage Navigator Modular is missing and the software is installed on the D: drive. Not sure if that matters, but I'll raise a case.