Clemens Klug 2018-05-07 16:12:14 +02:00
parent 5e28b9eecc
commit a858231513
4 changed files with 38 additions and 17 deletions

@@ -29,50 +29,62 @@ This means only the questions expressible in the query language can be answered.
Additionally, this requires users to master the query language before any reasonable conclusions can be extracted.
By building a custom plugin, extension, or modified version, it is possible to circumvent this obstacle.
However, the fast-paced environment of the industry either requires a constant effort of keeping pace, or results in an outdated system rather quickly. (E.g. the next major release Kibana v6.0.0\footnote{\url{https://github.com/elastic/kibana/releases/tag/v6.0.0}} was released about a year after Kibana v5.0.0\footnote{\url{https://github.com/elastic/kibana/releases/tag/v5.0.0}}. However, the previous major version seems to receive updates for about a year, too.)
\image{\textwidth}{../../PresTeX/images/kibana}{Game trace in Kibana}{img:kibana}
\image{\textwidth}{../../PresTeX/images/kibana2}{Game trace in Kibana}{img:kibana2}
\subsection{Evaluation Grafana}
Grafana is a solution to analyze, explore, and visualize various sources of time-series data.
There exist plugins for nearly any storage and collection backend for metrics\furl{https://grafana.com/plugins?type=datasource}.
The different backends are available through a unified user interface shown in \autoref{img:grafana}.
Spatial resolution suffers under the same conditions as with Kibana, at an even lower resolution. %TODO: image
\image{\textwidth}{grafana-metrics}{Configuring a graph in Grafana}{img:grafana}
\subsection{Conclusion}
All in all, the monitoring solutions are no perfect match for this special use case.
The privacy concerns vital in web monitoring prohibit detailed spatial analyses, the query languages can restrict some questions, and custom extensions require constant integration effort.
Regarding the specified use cases, especially the non-expert users benefit from a simple-to-use interface.
The default Kibana workbench does not qualify for this; a custom interface could improve the situation.
Grafana does have support for shared dashboards with a fixed set of data, which could serve as such a simplified interface for non-expert users.
\section{Architectural Design}
\subsection{Overview}
While the development of a custom stack requires a lot of infrastructural work to get the project running, the learnings above suggest a custom solution as a feasible alternative:
\begin{itemize}
\item Developing from the bottom up takes less time than diving into complex turn-key monitoring solutions.
\item With rather limited amounts of data\footnote{From a sample of 436 game logs from BioDiv2go, an average log file is 800 kB in size, with a median of 702 kB}, scalable solutions are not a hard requirement.
\item No core dependencies on fast-paced projects.
\item Interfaces tailored to the requirements: a simple web interface for non-expert users, and a CLI and API with unrestricted possibilities for researchers.
\item A focus on key points allows simple, easily extensible interfaces and implementations.
\item Reducing the complexity to an overseeable level, the processes and results can be verified for accuracy and reliability.
\end{itemize}
With the requirements from \autoref{sec:require} and the learnings from the log processing evaluations in mind, a first architectural approach is visualized in \autoref{img:solution}.
It outlines the three main components of the project: two user-facing services (Web \& CLI/API) and an analysis framework.
The interfaces (Web and CLI/API) for both target groups (see \autoref{sec:require}) are completely dependent on the analysis framework at the core.
\image{\textwidth}{solution.pdf}{Architecture approach}{img:solution}
The following sections describe each of those components.
\subsection{Analysis Framework}
The analysis framework takes game logs, processes their entries, collects results, and renders them to an output.
With a Map-Reduce pattern as the basic structure for the data flow, an ordered collection of analyzing, matching, postprocessing, and rendering operations defines an analysis run.
\autoref{img:flow} shows the data flows through the framework.
Every processed log file has its own analysis chain.
The log entries are fed sequentially into the analysis chain.
\image{\textwidth}{map-reduce.pdf}{Data flows}{img:flow}
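To make the data flow concrete, the following minimal Python sketch illustrates how such an analysis run could be structured; all names are illustrative assumptions, not the actual implementation:
\begin{verbatim}
# Minimal sketch of the Map-Reduce style analysis run.
# All names are illustrative assumptions.

def analyze_log(entries, analyzers):
    """Map step: feed the entries of one game log
    sequentially through its own analysis chain."""
    for entry in entries:
        for analyzer in analyzers:
            entry = analyzer.process(entry)
            if entry is None:   # the entry was consumed
                break
    return [a.result() for a in analyzers]

def run(logs, make_chain, postprocess, render):
    """One analysis run: map over all logs, collect
    (reduce) the results, and render the output."""
    # every log file gets its own, fresh analysis chain
    mapped = [analyze_log(log, make_chain()) for log in logs]
    return render(postprocess(mapped))
\end{verbatim}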
\subsubsection{Analyzer}
An Analyzer takes one log entry at a time and processes it.
With dynamic selectors stored in the settings, Analyzers can be used for multiple game types.
For specific needs, Analyzers can be tailored to a specific game, too.
While processing, the Analyzer can choose to read, manipulate, or consume the log entry.
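A minimal sketch of such an Analyzer, matching the data-flow sketch above (the selector handling is an assumption for illustration):
\begin{verbatim}
# Hypothetical Analyzer: reads values picked by a dynamic
# selector from the settings, so the same class can be
# reused across multiple game types.
class Analyzer:
    def __init__(self, settings):
        self.selector = settings["selector"]
        self.values = []

    def process(self, entry):
        if self.selector in entry:               # read
            self.values.append(entry[self.selector])
        return entry   # return None to consume the entry

    def result(self):
        return self.values
\end{verbatim}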
\paragraph{Reading a log entry}
Every Analyzer can read all of the log entry's contents.
@@ -101,6 +113,12 @@ A whole range from static plots and CSV exports to structured JSON data for inte
\subsubsection{Log parser}
Key to the framework described above is a component to import game log data, parse it, and prepare it to a common format for processing.
This needs to be adapted for each supported game.
It has to know where game logs are stored and how they can be accessed.
Configurable items like URLs allow supporting, e.g., different game servers.
The important step is parsing the game logs from the formats used by the games (e.g. JSON, XML, plain text, database, …) into the common format.
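As an illustration, a parser for a game that logs JSON could look like the following sketch; the file layout and field names are assumptions:
\begin{verbatim}
import json

# Hypothetical parser for one supported game; the JSON
# layout and the field names are assumptions.
def parse_game_log(path):
    """Read one game log and yield entries in the
    common format used by the analysis chains."""
    with open(path, encoding="utf-8") as f:
        raw = json.load(f)
    for event in raw["events"]:
        yield {"time": event["timestamp"],
               "type": event["type"],
               "data": event}   # full event for Analyzers
\end{verbatim}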
\subsection{Web Interface}
The web interface is rather straightforward:
Expert users prepare a set of analysis methods and bundle them with suitable rendering targets into an analysis suite.
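Such an analysis suite could, for example, be bundled as a simple declarative structure (a sketch with assumed names):
\begin{verbatim}
# Hypothetical analysis suite: analysis methods bundled
# with a rendering target; non-expert users only select
# and run a prepared suite.
suite = {
    "name": "Play time overview",
    "chain": [{"selector": "timestamp"},
              {"selector": "position"}],
    "render": "static_plot",   # or e.g. "csv_export"
}
\end{verbatim}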


@@ -3,7 +3,7 @@ digraph{
{
//rank=same;
s [label="Web Interface"];
a [label="Analysis Framework"];
c [label="CLI / API"];
}
s -> a;

@@ -178,7 +178,6 @@ Abgabedatum:\> \@date\\
\end{figure}
}
%#1 file (located in the graphic directory)
%#2 caption
%#3 label for referencing
@@ -451,3 +450,7 @@ major line width/.initial=1pt,
\changemenucolor{gray}{br}{named}{unibablueI}
\changemenucolor{gray}{txt}{named}{unibablueI}
\fi
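% footnote containing only a URL; usage: \furl{https://example.org}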
\newcommand{\furl}[1]{
\footnote{\url{#1}}
}