Hi, we are using Elasticsearch with the edX analytics pipeline, and I hit the error below while running ModuleEngagementWorkflowTask. I am very new to Elasticsearch, so any help is appreciated.
2018-04-07 10:01:36,862 INFO 80774 [luigi-interface] worker.py:282 - [pid 80774] Worker Worker(salt=248926547, host=analytics.lithan.com, username=hadoop, pid=80774) running ModuleEngagementRosterIndexTask(source=('hdfs://localhost:9000/data/',), expand_interval=2 days, 0:00:00, pattern=('.tracking.log.',), date_pattern=%Y%m%d, warehouse_path=hdfs://localhost:9000/edx-analytics-pipeline/warehouse/, host=('http://127.0.0.1:9200/',), date=2018-04-07, obfuscate=False, scale_factor=1, alias=roster, number_of_shards=3)
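(The host parameter in the task line above is where the loader will connect. As a quick sanity check, that tuple can be parsed with the standard library to confirm the exact host and port the pipeline targets — a Python 3 sketch, with the value copied verbatim from the log:)

```python
from urllib.parse import urlparse

# host tuple exactly as it appears in the ModuleEngagementRosterIndexTask parameters
hosts = ('http://127.0.0.1:9200/',)

parsed = urlparse(hosts[0])
print(parsed.hostname, parsed.port)  # 127.0.0.1 9200
```

So the task is definitely trying to reach port 9200 on the local machine.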
2018-04-07 10:01:43,643 ERROR 80774 [edx.analytics.tasks.common.elasticsearch_load] elasticsearch_load.py:409 - Unable to rollback the elasticsearch load.
Traceback (most recent call last):
  File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/edx/analytics/tasks/common/elasticsearch_load.py", line 406, in rollback
    if elasticsearch_client.indices.exists(index=self.index):
  File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/elasticsearch/client/utils.py", line 69, in wrapped
    return func(*args, params=params, **kwargs)
  File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/elasticsearch/client/indices.py", line 224, in exists
    params=params)
  File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/elasticsearch/transport.py", line 307, in perform_request
    status, headers, data = connection.perform_request(method, url, params, body, ignore=ignore, timeout=timeout)
  File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/elasticsearch/connection/http_urllib3.py", line 89, in perform_request
    raise ConnectionError('N/A', str(e), e)
ConnectionError: ConnectionError(<urllib3.connection.HTTPConnection object at 0x7f84ec9e9050>: Failed to establish a new connection: [Errno 111] Connection refused) caused by: NewConnectionError(<urllib3.connection.HTTPConnection object at 0x7f84ec9e9050>: Failed to establish a new connection: [Errno 111] Connection refused)
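(Errno 111 means nothing accepted the TCP connection on 127.0.0.1:9200 at all. The same condition can be reproduced independently of the elasticsearch client with a raw socket check — a Python 3 stdlib sketch, just for diagnosis:)

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # ConnectionRefusedError, timeouts, unreachable hosts
        return False

# False until an Elasticsearch node is actually listening on 9200
print(port_open("127.0.0.1", 9200))
```

In my case this returns False, matching the curl failure below.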
I also cannot connect with curl -XGET 'http://localhost:9200/_status':
sudo curl http://localhost:9200
curl: (7) Failed to connect to localhost port 9200: Connection refused
sudo vi /etc/elasticsearch/elasticsearch.yml

# Set the bind address specifically (IPv4 or IPv6):
#network.bind_host: 192.168.0.1
The service is reported as running on Ubuntu 16.04:
lithanr@analytics:~$ sudo service elasticsearch status
● elasticsearch.service - LSB: Starts elasticsearch
Loaded: loaded (/etc/init.d/elasticsearch; bad; vendor preset: enabled)
Active: active (exited) since Sat 2018-04-07 13:25:20 +08; 18min ago
Docs: man:systemd-sysv-generator(8)
Process: 49138 ExecStop=/etc/init.d/elasticsearch stop (code=exited, status=0/SUCCESS)
Process: 49188 ExecStart=/etc/init.d/elasticsearch start (code=exited, status=0/SUCCESS)
Apr 07 13:25:20 analytics.lithan.com systemd[1]: Stopped LSB: Starts elasticsearch.
Apr 07 13:25:20 analytics.lithan.com systemd[1]: Starting LSB: Starts elasticsearch...
Apr 07 13:25:20 analytics.lithan.com systemd[1]: Started LSB: Starts elasticsearch.
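(Since `Active: active (exited)` for an LSB init script only confirms the start script exited cleanly, not that the Java daemon stayed up, here is a small Python 3 stdlib sketch I use after a restart to poll whether anything actually starts listening on the port — nothing edX-specific, just a retry loop:)

```python
import socket
import time

def wait_for_port(host, port, deadline=30.0, interval=1.0):
    """Poll until a TCP connection to host:port succeeds or the deadline passes."""
    end = time.monotonic() + deadline
    while time.monotonic() < end:
        try:
            with socket.create_connection((host, port), timeout=interval):
                return True
        except OSError:
            time.sleep(interval)
    return False

# e.g. wait_for_port("127.0.0.1", 9200, deadline=60) after `sudo service elasticsearch restart`
```

Even with a generous deadline this never returns True for me, so the daemon seems to die right after the init script reports success.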
When I check /var/log/elasticsearch, the log files are not being updated. Can anybody help me enable logging?
This is my /etc/elasticsearch/logging.yml:
# you can override this using by setting a system property, for example -Des.logger.level=DEBUG
es.logger.level: INFO
rootLogger: ${es.logger.level}, console, file
logger:
  # log action execution errors for easier debugging
  action: DEBUG

  # reduce the logging for aws, too much is logged under the default INFO
  com.amazonaws: WARN
  org.apache.http: INFO

  # gateway
  #gateway: DEBUG
  #index.gateway: DEBUG

  # peer shard recovery
  #indices.recovery: DEBUG

  # discovery
  #discovery: TRACE

  index.search.slowlog: TRACE, index_search_slow_log_file
  index.indexing.slowlog: TRACE, index_indexing_slow_log_file

additivity:
  index.search.slowlog: false
  index.indexing.slowlog: false
appender:
  console:
    type: console
    layout:
      type: consolePattern
      conversionPattern: "[%d{ISO8601}][%-5p][%-25c] %m%n"

  file:
    type: dailyRollingFile
    file: ${path.logs}/${cluster.name}.log
    datePattern: "'.'yyyy-MM-dd"
    layout:
      type: pattern
      conversionPattern: "[%d{ISO8601}][%-5p][%-25c] %m%n"

  # Use the following log4j-extras RollingFileAppender to enable gzip compression of log files.
  # For more information see https://logging.apache.org/log4j/extras/apidocs/org/apache/log4j/rolling/RollingFileAppender.html
  #file:
    #type: extrasRollingFile
    #file: ${path.logs}/${cluster.name}.log
    #rollingPolicy: timeBased
    #rollingPolicy.FileNamePattern: ${path.logs}/${cluster.name}.log.%d{yyyy-MM-dd}.gz
    #layout:
      #type: pattern
      #conversionPattern: "[%d{ISO8601}][%-5p][%-25c] %m%n"

  index_search_slow_log_file:
    type: dailyRollingFile
    file: ${path.logs}/${cluster.name}_index_search_slowlog.log
    datePattern: "'.'yyyy-MM-dd"
    layout:
      type: pattern
      conversionPattern: "[%d{ISO8601}][%-5p][%-25c] %m%n"

  index_indexing_slow_log_file:
    type: dailyRollingFile
    file: ${path.logs}/${cluster.name}_index_indexing_slowlog.log
    datePattern: "'.'yyyy-MM-dd"
    layout:
      type: pattern
      conversionPattern: "[%d{ISO8601}][%-5p][%-25c] %m%n"
My first priority is getting Elasticsearch logging working so I can debug further. Can anyone help?