Let a graph g, an initial context c_0, and partial information i_p be given.

Input: g, c_0, i_p. Output: contextual knowledge graph c.

1. Initialize the scope from c_0 and i_p.
2. While the scope is not completely fulfilled:
   a. Perform a BFS expansion on c_0 using the unfulfilled scope, the filters, and i_p.
   b. If the context is expanded, update the scope and continue.
   c. Otherwise, apply learning with inquire to expand g (but not the context or the scope); retry on success, or break on failure.
3. Return the context c, together with the fulfilled scope for direct access to i.

We call the resulting context c the contextual knowledge graph, and the extractable knowledge k_c = Ex(c) the contextual knowledge, obtained from g, c_0, i_p, and the filters. Note also that fn(i) = fn(k_c), so the resulting contextual knowledge is sufficient for TM computation. Below we prove that this algorithm yields knowledge that is g-bounded complete, which also establishes that the algorithm is correct.
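The following is a minimal sketch of the loop above, under assumptions not made in the text: the graph is an adjacency map over node identifiers, the scope is represented as the set of items of i_p not yet covered by the context, and the helper names (filters, learn_with_inquire) are hypothetical stand-ins for the paper's filter and learning-with-inquire operations.

```python
from typing import Callable, Dict, Set

Graph = Dict[str, Set[str]]  # assumed adjacency representation: node -> neighbours


def build_contextual_knowledge_graph(
    g: Graph,
    c0: Set[str],                     # initial context: nodes of g
    i_p: Set[str],                    # partial information: target items
    filters: Callable[[str], bool],   # admissibility filter on candidate nodes
    learn_with_inquire: Callable[[Graph, Set[str]], bool],  # expands g in place; True on success
):
    """Return (context c, fulfilled scope); a sketch, not the paper's definition."""
    context: Set[str] = set(c0)
    scope: Set[str] = set(i_p) - context      # initialize scope from c_0 and i_p

    while scope:                               # scope not completely fulfilled
        # BFS expansion of the context toward the unfulfilled scope.
        frontier = set(context)
        expanded = False
        while frontier and scope:
            nxt: Set[str] = set()
            for node in frontier:
                for nb in g.get(node, set()):
                    if nb not in context and filters(nb):
                        context.add(nb)
                        nxt.add(nb)
                        expanded = True
            scope -= context                   # update scope with what was reached
            frontier = nxt

        if expanded:
            continue                           # context grew: keep expanding

        # Context could not be expanded: try to grow g itself, not the context.
        if not learn_with_inquire(g, scope):
            break                              # learning failed: give up

    fulfilled = set(i_p) - scope               # items of i now directly accessible in c
    return context, fulfilled
```

Note the division of labour in the sketch: BFS only ever grows the context within the current g, while learn_with_inquire only grows g, mirroring the requirement that learning expands g but not the context or the scope.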